# Advanced Usage

Scheduled tasks and batch scripts.

## 🤖 Automation and Scripting
### Batch Download (from a URL list)

```shell
# Read URLs from a file
gallery-dl -i urls.txt
# urls.txt: one URL per line
```

### Incremental Download Script

Combine a download archive with a scheduled task to automate incremental downloads:
```shell
#!/bin/bash
# auto-download.sh
URLS=(
    "https://twitter.com/user1"
    "https://twitter.com/user2"
    "https://www.instagram.com/user3/"
)
for url in "${URLS[@]}"; do
    gallery-dl --download-archive archive.sqlite3 \
        -c config.json "$url"
    sleep 10  # pause between accounts
done
```

Add it to crontab:

```shell
# Run every day at 3:00 AM
0 3 * * * /path/to/auto-download.sh
```

On Windows, the PowerShell equivalent:

```powershell
# auto-download.ps1
$urls = @(
    "https://twitter.com/user1",
    "https://twitter.com/user2",
    "https://www.instagram.com/user3/"
)
foreach ($url in $urls) {
    gallery-dl --download-archive archive.sqlite3 `
        -c config.json $url
    Start-Sleep -Seconds 10
}
```

Add it to Task Scheduler:
```powershell
# Create a daily scheduled task
$action = New-ScheduledTaskAction -Execute "powershell" `
    -Argument "-File C:\scripts\auto-download.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At "3:00AM"
Register-ScheduledTask -TaskName "gallery-dl-auto" `
    -Action $action -Trigger $trigger
```

### Calling gallery-dl from Python
```python
import sys
import gallery_dl

# Basic usage: gallery_dl.main() takes no arguments and
# reads its options from sys.argv
sys.argv = ["gallery-dl", "URL"]
gallery_dl.main()

# With a config file and download archive
sys.argv = [
    "gallery-dl",
    "--config", "config.json",
    "--download-archive", "archive.sqlite3",
    "URL",
]
gallery_dl.main()
```

### Error Handling and Logging
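When driving gallery-dl from a script, checking the exit code of a subprocess call is often the simplest form of error handling: gallery-dl exits nonzero when errors occurred. The sketch below illustrates this; `run_gallery_dl` is a hypothetical helper, and the `cmd` parameter exists only so the executable can be swapped out for testing.

```python
import subprocess

def run_gallery_dl(url, archive="archive.sqlite3", cmd="gallery-dl"):
    """Run one download and report success via the exit code.

    `cmd` is parameterized purely for illustration; in real use
    it stays "gallery-dl".
    """
    result = subprocess.run(
        [cmd, "--download-archive", archive, url],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # stderr usually carries the reason for the failure
        print(f"download failed for {url}: {result.stderr.strip()}")
    return result.returncode == 0
```

A wrapper like this can decide whether to retry a URL, skip it, or send an alert, instead of letting one bad account abort the whole batch.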
```json
{
    "output": {
        "log": {
            "level": "info",
            "format": "[{name}] {message}"
        },
        "logfile": {
            "path": "./gallery-dl.log",
            "mode": "a",
            "level": "debug"
        }
    }
}
```

When combined with `--download-archive`, each run downloads only content added since the last run, which makes this setup well suited to periodic scraping.
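The archive mechanism behind this can be illustrated with a small sketch: each downloaded item is identified by a unique entry key, and a key already present in the database is skipped on later runs. This is a simplified model for illustration, not gallery-dl's exact schema.

```python
import sqlite3

def is_new(conn, entry):
    """Record the entry and return True if unseen; False if already archived."""
    try:
        conn.execute("INSERT INTO archive (entry) VALUES (?)", (entry,))
        conn.commit()
        return True
    except sqlite3.IntegrityError:
        # primary-key conflict: the entry was downloaded on an earlier run
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (entry TEXT PRIMARY KEY)")

print(is_new(conn, "twitter_user1_12345"))  # first run: True
print(is_new(conn, "twitter_user1_12345"))  # second run: False -> skipped
```

Because the check happens per entry rather than per URL, re-running the same account URL every night costs almost nothing once its history has been fetched.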