Introduction
In enterprise Kubernetes environments, Harbor is widely used as the mainstream image registry and Helm Chart management platform. However, in migration, backup, or offline deployment scenarios, pulling Charts from Harbor in batches can run into inefficiency or permission problems. This article combines the Helm CLI with shell scripts into a one-click batch export solution.
1. Environment preparation and Helm repository configuration
1.1 Adding Harbor Repository
To associate the Harbor repository through the Helm command-line tool, specify the project path and authentication information:
# Configure variables (modify for your actual environment)
HARBOR_URL="https://harbor.example.com"   # hypothetical example; use your Harbor address
PROJECT_NAME="your-project"
USERNAME="admin"
PASSWORD="your-password"
# Add the Helm repository
helm repo add harbor-repo ${HARBOR_URL}/chartrepo/${PROJECT_NAME} \
  --username=${USERNAME} \
  --password=${PASSWORD}
# Refresh the repository index (a stale cache can cause an incomplete list;
# note that --force-update is a flag of helm repo add, not helm repo update)
helm repo update harbor-repo
Note: ChartMuseum is deprecated as of Harbor 2.6.0 and removed in 2.8.0; on versions without ChartMuseum, the chartrepo endpoint above is unavailable and Charts are stored and pulled as OCI artifacts instead.
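For OCI mode, the equivalent commands look like this (a minimal sketch, assuming Helm ≥ 3.8 and the variables defined above; the chart name nginx and its version are illustrative):
# Log in to the Harbor OCI registry (host only, no scheme)
helm registry login ${HARBOR_URL#https://} --username ${USERNAME} --password ${PASSWORD}
# Pull a chart stored as an OCI artifact
helm pull oci://${HARBOR_URL#https://}/${PROJECT_NAME}/nginx --version 1.2.3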
1.2 Verify repository visibility
helm search repo harbor-repo   # the Charts list should be displayed
If the list is empty, open the Harbor console → target project → Helm Charts and confirm that Charts have actually been uploaded.
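For reference, a successful search prints a table roughly like this (names and versions are illustrative):
NAME                CHART VERSION   APP VERSION   DESCRIPTION
harbor-repo/nginx   4.5.6           1.25.0        NGINX web server
harbor-repo/redis   8.1.0           7.0.0         Redis key-value store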
2. Automated batch pull script
2.1 Core script logic
Create a script (named pull-charts.sh here) that implements the following functions:
• Dynamically fetch the Charts list
• Create a directory per Chart
• Download all historical versions
#!/bin/bash
set -euo pipefail

REPO_NAME="harbor-repo"
OUTPUT_DIR="./harbor-charts"
mkdir -p "${OUTPUT_DIR}"

# Get the Charts list (skip the header; -l emits one row per version, so deduplicate the names)
CHARTS=$(helm search repo "${REPO_NAME}" -l | awk 'NR>1 {print $1}' | sort -u)

for CHART in ${CHARTS}; do
  SHORT_NAME=$(echo "${CHART}" | sed "s|${REPO_NAME}/||")
  CHART_DIR="${OUTPUT_DIR}/${SHORT_NAME}"
  mkdir -p "${CHART_DIR}"
  # Get all versions (JSON parsing; select by exact name, since search also matches prefixes)
  VERSIONS=$(helm search repo "${CHART}" --versions -o json \
    | jq -r --arg n "${CHART}" '.[] | select(.name == $n) | .version')
  # Parallel download (4 processes)
  echo "${VERSIONS}" | xargs -n1 -P4 -I{} helm pull "${CHART}" --version {} --destination "${CHART_DIR}"
done

echo "Download completed! Output directory: ${OUTPUT_DIR}"
2.2 Run the script
chmod +x pull-charts.sh && ./pull-charts.sh
Example output directory structure (version numbers are illustrative):
harbor-charts/
├── nginx/
│   ├── nginx-1.2.3.tgz
│   └── nginx-4.5.6.tgz
└── redis/
    ├── redis-7.0.1.tgz
    └── redis-8.1.0.tgz
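A quick sanity check that everything landed on disk (counts the downloaded packages):
find harbor-charts -name '*.tgz' | wc -l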
3. Frequently Asked Questions and Solutions
3.1 Permission Error (403 Forbidden)
• Cause: insufficient user permissions or expired credentials
• Fix:
# Re-register the repository credentials (helm repo update does not accept
# --username/--password; use helm repo add with --force-update instead)
helm repo add harbor-repo ${HARBOR_URL}/chartrepo/${PROJECT_NAME} \
  --username=${USERNAME} --password=${PASSWORD} --force-update
Also make sure the Harbor user has at least the Guest role on the project.
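To confirm the credentials themselves are valid before re-adding the repository, you can query the standard Harbor v2.0 API directly (a small sketch using the variables from section 1.1):
# A 200 response containing your user object means the credentials are fine
curl -s -k -u "${USERNAME}:${PASSWORD}" "${HARBOR_URL}/api/v2.0/users/current" | jq .username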
3.2 Charts list incomplete
• Cause: Helm cache or pagination limits
• Solution: query the Harbor API directly instead (example snippet):
# Paginated query of the repository list (Harbor API v2.0; the project name is part of the path)
PAGE=1
while true; do
  REPO_PAGE=$(curl -s -k -u "${USERNAME}:${PASSWORD}" \
    "${HARBOR_URL}/api/v2.0/projects/${PROJECT_NAME}/repositories?page=${PAGE}&page_size=50")
  # Stop once a page comes back empty
  [ "$(echo "${REPO_PAGE}" | jq 'length')" -eq 0 ] && break
  echo "${REPO_PAGE}" | jq -r '.[].name'   # parse and append to the list...
  PAGE=$((PAGE + 1))
done
3.3 Download interruption or timeout
• Optimization: add a retry mechanism
for VERSION in ${VERSIONS}; do
  until helm pull "${CHART}" --version "${VERSION}" --destination "${CHART_DIR}"; do
    echo "Retrying ${VERSION}..."
    sleep 10
  done
done
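The until loop above retries forever on a permanently broken version; a bounded variant (a sketch, with MAX_RETRIES as an arbitrary cap) gives up after a fixed number of attempts:
MAX_RETRIES=5
for VERSION in ${VERSIONS}; do
  for ATTEMPT in $(seq 1 "${MAX_RETRIES}"); do
    helm pull "${CHART}" --version "${VERSION}" --destination "${CHART_DIR}" && break
    echo "Attempt ${ATTEMPT}/${MAX_RETRIES} for ${VERSION} failed; retrying in 10s..."
    sleep 10
  done
done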
4. Extended application scenarios
4.1 Pull only the latest version
Modify the version-acquisition logic to take just the first (newest) entry:
VERSIONS=$(helm search repo "${CHART}" --versions -o json | jq -r '.[0].version')
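Alternatively, helm pull fetches the latest stable version by default when --version is omitted, so the jq step can be dropped entirely:
helm pull "${CHART}" --destination "${CHART_DIR}"   # no --version means latest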
4.2 Integration with CI/CD pipeline
Configure a pipeline job in GitLab CI:
pull-charts:
  stage: deploy
  script:
    - apk add --no-cache bash helm jq   # assumes an Alpine-based image; bash is needed by the script
    - ./pull-charts.sh
  artifacts:
    paths:
      - harbor-charts/
4.3 Offline environment distribution
Package the output into a compressed archive and transfer it:
tar czvf harbor-charts-$(date +%Y%m%d).tgz harbor-charts/
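On the receiving side, the archive can be extracted and pushed into the target Harbor. A sketch assuming the target runs in OCI mode (Helm ≥ 3.8) and that target-harbor.example.com is a placeholder for the real host:
tar xzvf harbor-charts-*.tgz
helm registry login target-harbor.example.com --username "${USERNAME}" --password "${PASSWORD}"
# Push every downloaded package into the destination project
find harbor-charts -name '*.tgz' \
  -exec helm push {} oci://target-harbor.example.com/${PROJECT_NAME} \;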
5. Summary
Through this solution, you can:
- Quick backup: save all Helm Charts in Harbor in one pass
- Simplified migration: synchronize Charts across clusters or across Harbor instances
- Offline deployment support: integrates seamlessly with air-gapped environments
It is recommended to pair the script with a scheduled task (such as a Kubernetes CronJob) for regular automated backups; for large-scale environments, the API pagination approach from section 3.2 helps keep listing fast.
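For the CronJob route, a minimal sketch follows. It assumes the script is stored in a ConfigMap named pull-charts-script, a PVC named chart-backup-pvc holds the output, and the alpine/helm image is acceptable; the script must also run helm repo add itself inside the pod (credentials are best injected from a Secret). Adjust all of these to your environment:
apiVersion: batch/v1
kind: CronJob
metadata:
  name: harbor-chart-backup
spec:
  schedule: "0 2 * * *"              # every day at 02:00
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: backup
              image: alpine/helm:latest        # assumption: any image shipping helm works
              command: ["/bin/sh", "-c"]
              # The script must add the Helm repo itself inside the pod
              args:
                - apk add --no-cache bash jq && /scripts/pull-charts.sh
              workingDir: /backup
              volumeMounts:
                - { name: script, mountPath: /scripts }
                - { name: output, mountPath: /backup }
          volumes:
            - name: script
              configMap:
                name: pull-charts-script
                defaultMode: 0755
            - name: output
              persistentVolumeClaim:
                claimName: chart-backup-pvc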