Download playlist with youtube-dl, saving all data (including the archive file) into the uploader's folder for each video
This can't be done just with arguments, as explained here: [Set download-archive path with args](https://reddit.com/r/youtubedl/comments/muysm1/set_downloadarchive_path_with_args/)

I want to make two similar scripts: one that downloads the videos and the archive file into the uploader's folder for each video, and another that also downloads the metadata. I want to check the archive before downloading, and I want to be able to use the function multiple times in parallel by putting the calls in the background.

The only way I can think of to get the uploader is to download the metadata with the output template `%(uploader)s`, then create a folder with the uploader's name, remove the file, and download the files again with the correct names from inside the created folder. I have [a few functions][1] defined already that may help with this.

This is what I've tried, but it isn't working:

```bash
# Video Playlist saving archive file to uploader's folder
ytp() {
    # Download metadata only
    opts=( ${opts[@]} --skip-download --write-info-json )
    # Get ten random characters
    local rand=$( cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 10 | head -n 1 )
    LOG_NAME=$( get_log_name "$(${paste[@]})" )
    $yt_dl ${opts[@]} --output "%(uploader|Anon-%(id)s)s.%(ext)s" "$(${paste[@]})" >> "/tmp/$LOG_NAME" 2>&1
    for f in *.info.json; do
        local name="${f%.*}"
        if [[ $name == Anon* ]]; then
            name="Anon"
        fi
        mkdir -p "$name"
        rm "$f"
        cd "$name" || exit
        # Download media
        ytv
        cd .. || exit
    done
}
```

[1]: https://git.disroot.org/hirrolot19/dotfiles/src/branch/main/config/zsh/plugins/yt-dl/yt-dl.plugin.zsh
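One simpler route, as a sketch: yt-dlp's `--print` option runs in simulate mode, so you can resolve the uploader name without downloading any media, then run the real download with the archive file inside that folder. `--print` and `--download-archive` are real yt-dlp options; the helper name `ytp_one` and the URL handling are hypothetical.

```shell
#!/usr/bin/env bash
# Sketch: resolve the uploader without downloading any media, then
# download with the archive file inside the uploader's folder.
# `ytp_one` is a hypothetical helper name, not part of the linked plugin.
ytp_one() {
    local url=$1 uploader
    # --print implies simulate mode, so nothing is downloaded here;
    # fall back to "Anon" if the extractor reports no uploader.
    uploader=$(yt-dlp --print '%(uploader|Anon)s' --playlist-items 1 "$url") || return 1
    mkdir -p "$uploader"
    # The per-uploader archive file lives inside the uploader's folder.
    yt-dlp --download-archive "$uploader/downloaded.txt" \
           --output "$uploader/%(title)s-%(id)s.%(ext)s" "$url"
}
```

Because the archive path is computed per call, several `ytp_one "$url" &` invocations for different uploaders can run in parallel without fighting over one archive file.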

[SOLVED] [yt-dlp] command not found: `xsel -ob`
I'm using Manjaro Linux. [I've changed my yt-dlp zsh configuration][1] and now I get:

```
❯ ytvp
deal_with_long_filename:1: command not found: xsel -ob
deal_with_long_filename:2: command not found: xsel -ob
```

The log shows:

```
Usage: yt-dlp [OPTIONS] URL [URL...]

yt-dlp: error: no such option: --continue --no-overwrites --no-post-overwrites --verbose --restrict-filenames --retry-sleep fragment:exp
Usage: yt-dlp [OPTIONS] URL [URL...]

yt-dlp: error: no such option: --continue --no-overwrites --no-post-overwrites --verbose --restrict-filenames --retry-sleep fragment:exp
```

Why is it treating all the options as a single one? I've tried running the `xsel -ob` command on its own and it works fine. How do I fix this? I would also like to keep the send-to-background `&` that I was using. Would that cause problems with the definition of the `deal_with_long_filename` function?

This is my configuration now:

```bash
opts="--continue --no-overwrites --no-post-overwrites --verbose --restrict-filenames --retry-sleep fragment:exp=2:64 --print-to-file"

if [ -f /usr/local/bin/youtube-dl ]; then
    yt_dlp="/usr/local/bin/yt-dlp"
else
    yt_dlp="$(which yt-dlp)"
fi

# If using Mac
if [[ "$(uname -a | awk '{print $1}')" == "Darwin" ]]; then
    paste="pbpaste"
    opts="${opts} --ffmpeg-location /usr/local/bin/ffmpeg"
else
    # If using Linux
    paste="xsel -ob"
fi

sanitize_linux_filename() {
    echo "$1" | sed -e 's/[^a-zA-Z0-9._-]/_/g'
}

get_log_name() {
    TIMESTAMP=$( date +%y%m%d%H%M%S )
    NAME=$( sanitize_linux_filename "$1" )
    echo "yt-dlp_${TIMESTAMP}_${NAME}.log"
}

deal_with_long_filename() {
    if ! $yt_dlp $opts --output "%(upload_date>%Y-%m-%d|)s%(upload_date& |)s%(title)s-%(id)s.%(ext)s" "$($paste)" >> "/tmp/$LOG_NAME" 2>&1; then
        $yt_dlp $opts --output "%(upload_date>%Y-%m-%d|)s%(upload_date& |)s%(webpage_url_domain)s-%(id)s.%(ext)s" "$($paste)" >> "/tmp/$LOG_NAME" 2>&1
    fi
}

# Video Playlist
ytvp() {
    LOG_NAME=$( get_log_name "$1" )
    opts="${opts} --format '(bv+(wa[abr>=64]/ba))/b' --format-sort res:720,tbr~2000 --no-playlist --download-archive 'downloaded.txt'"
    deal_with_long_filename "$1" "$LOG_NAME"
}
```

[1]: https://git.disroot.org/hirrolot19/dotfiles/commit/aefb950b3a6f5ddcb6504d093ef96bfca6013ead#diff-6a0a53cfa400a5b85f7791056115eb86a4e7c783
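The likely cause, based on the error text: unlike bash, zsh does not word-split unquoted scalar parameters by default, so `$paste` expands to the single word `xsel -ob` and zsh looks for a command literally named `xsel -ob`; likewise the whole `$opts` string reaches yt-dlp as one giant "option". Storing commands and options in arrays fixes both, and works unchanged with a trailing `&`. A minimal sketch:

```shell
# Sketch of the fix: use arrays, not space-separated strings.
paste=(xsel -ob)        # on macOS: paste=(pbpaste)
opts=(
    --continue --no-overwrites --no-post-overwrites --verbose
    --restrict-filenames --retry-sleep 'fragment:exp=2:64'
)

# Expanding with [@] passes each element as its own argument:
#   "$yt_dlp" "${opts[@]}" --output "..." "$("${paste[@]}")"
```

Appending more options later becomes `opts+=(--no-playlist --download-archive downloaded.txt)` instead of string concatenation.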

psgrep alias to show process id and command
I want to rewrite this command so that it shows the process id, the command name, and its arguments without having to spell out all those fields for `awk`:

```bash
# show process id and command with arguments
function psgrep() {
    ps aux | grep "${1:-.}" | grep -v grep | awk '{print $2, $11, $12, $13, $14, $15, $16, $17, $18, $19, $20}'
}
```
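One way to drop the `awk` field list entirely, as a sketch: `ps` can be asked for exactly the two columns with the standard `-o` output-format option, where `args` means the full command line.

```shell
# Sketch: ask ps for just the two columns instead of trimming awk fields.
psgrep() {
    # -e: every process; -o pid,args: PID plus the full command line.
    # `grep -v grep` drops the grep processes of this pipeline itself.
    ps -eo pid,args | grep -v grep | grep "${1:-.}"
}
```

`pid,args` is POSIX `ps` syntax, so this avoids the column-numbering assumptions of `ps aux` output as well.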

Send every process to background and disown, in zsh
In zsh, I want every process sent to the background automatically, so that I can keep typing in the terminal when a command takes longer than I expected, instead of having to append `&` to every command or press `Ctrl + Z` to suspend it and then run `bg`. I also want the process to keep running after I close the terminal, so I want it disowned. Can this be done by default with some zsh setting?
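As far as I know there is no zsh option that backgrounds every command automatically (the shell could not tell interactive commands from long-running ones), but two zsh features get close: `setopt NO_HUP` keeps background jobs alive when the shell exits, and the `&!` token backgrounds and disowns in one step. A `.zshrc` fragment, as a sketch:

```shell
# ~/.zshrc fragment (zsh-only; a sketch, not a complete solution)
setopt NO_HUP        # don't send SIGHUP to background jobs when the shell exits
setopt NO_CHECK_JOBS # don't warn about running background jobs on exit

# zsh's `&!` (equivalently `&|`) backgrounds AND disowns in one step:
#   some_long_command &!
```

With these set, `some_long_command &!` survives closing the terminal without a separate `disown` call.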

[SOLVED] How to find files with same relative paths in multiple folders?
I want to make a virtual file system from a few folders and want to check whether there are any conflicting files. So I want to provide a few folders and get the files that have the same path relative to their folders. How can I find the conflicts?

```bash
#!/usr/bin/env bash
# Find conflicting paths in multiple folders to use with mergerfs
# Usage: ./conflict.sh /path/to/folder1 /path/to/folder2 /path/to/folder3 > conflicts.txt

find_conflicting_files() {
    local folders=("$@")
    local files=()

    if [[ ${#folders[@]} -lt 2 ]]; then
        echo "Please provide at least 2 folders"
        exit 1
    fi

    for folder in "${folders[@]}"; do
        files+=("$(find "$folder" -type f)")
    done

    # find all conflicting files

    # print the conflicting files
    for conflict in "${conflicts[@]}"; do
        echo "$conflict"
    done
}
```
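One way to fill in the missing step, as a sketch: print each file's path *relative to its starting folder* and let `sort | uniq -d` report any path that occurs more than once. This assumes GNU find, whose `-printf '%P\n'` prints exactly that relative path.

```shell
#!/usr/bin/env bash
# Sketch: a relative path that appears under two or more of the given
# folders is a conflict for a mergerfs-style union.
find_conflicting_files() {
    local folder
    if [[ $# -lt 2 ]]; then
        echo "Please provide at least 2 folders" >&2
        return 1
    fi
    for folder in "$@"; do
        # %P = path relative to the starting point (GNU find only)
        find "$folder" -type f -printf '%P\n'
    done | sort | uniq -d   # -d prints only duplicated lines
}
```

Usage: `find_conflicting_files /path/to/folder1 /path/to/folder2 > conflicts.txt`. On non-GNU systems, `find "$folder" -type f | sed "s|^$folder/||"` is a rough substitute for `-printf '%P\n'`.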

Ask specific questions about how to code something in sh, bash, zsh, etc

General bash discussion on lemmy.ml


  • 0 users online
  • 1 user / day
  • 1 user / week
  • 1 user / month
  • 4 users / 6 months
  • 1 subscriber
  • 5 Posts
  • 5 Comments
Heap Overflow
A place to ask programming questions and share free resources

General programming discussion, additional resources, challenges

To post or comment:

  1. Create an account on another lemmy instance
  2. Then search for the community url like here

RULES:

  1. No politics
  2. No flaming / trolling
  3. No proprietary BS
  4. Stay on topic

Please keep questions & examples short!

All content is Public Domain unless otherwise specified.
Only CC0, CC BY 3.0, or CC BY-SA 3.0 alternate licenses are allowed.

No affiliation with StackOverflow.com