cclarry

Waves Sampler Library Updated


This sucks. I have a folder called Acid Media in which all loops are sorted into folders by developer. It won't scan the whole thing; I have to choose folders inside this directory.

1 hour ago, kitekrazy said:

This sucks. I have a folder called Acid Media in which all loops are sorted into folders by developer. It won't scan the whole thing; I have to choose folders inside this directory.

Lame hack for a lame problem, but you could flatten the directory tree without duplicating data by creating hard links:

(Run in PowerShell at the root of the directory tree)

foreach ($file in Get-ChildItem . -Recurse -Include *.ext) {
    $path = $file.FullName
    New-Item -ItemType HardLink -Value $path -Path $path.Replace("$pwd\", '').Replace('\', '-')
}

It recurses down from the directory it's run in, finds all files ending in .ext, and creates hard links to them in the current directory, naming each new link after the file's relative path (from the tree root), with backslashes replaced by dashes. Because the links' targets are inodes in the filesystem, they are absolute and the links can be placed anywhere. You can delete the links freely; a file's data isn't deleted until all of its hard links are gone.

Replace .ext with whatever file extension the loops use, or if there are multiple extensions you can separate them with commas, e.g.: -Include *.wav,*.mp3

You can also change the formatting of the link names, e.g. use something other than dashes (the last '-'), but including part of the path in the filename is probably a good idea to avoid filename clashes.
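For anyone not on PowerShell, here's a rough cross-platform equivalent sketched in Python. The function name and default extensions are just placeholders; it assumes a filesystem that supports hard links (NTFS, ext4, APFS, etc.):

```python
import os
from pathlib import Path

def flatten_with_hardlinks(root, exts=(".wav", ".mp3")):
    """Create hard links in `root` for every matching file below it,
    naming each link after its relative path with separators
    replaced by dashes (e.g. DevA/kick.wav -> DevA-kick.wav)."""
    root = Path(root)
    # Materialize the listing first so the links we create aren't re-scanned.
    for file in list(root.rglob("*")):
        if file.is_file() and file.suffix.lower() in exts:
            link_name = str(file.relative_to(root)).replace(os.sep, "-")
            link_path = root / link_name
            if not link_path.exists():
                os.link(file, link_path)  # hard link: same inode, no data copied
```

Same caveats as the PowerShell version: the links share storage with the originals, so deleting either name alone doesn't lose any data.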

Absolutely no warranty. 😄

 

 

54 minutes ago, sarine said:

Lame hack for a lame problem, but you could flatten the directory tree without duplicating data by creating hard links:

(Run in PowerShell at the root of the directory tree)

foreach ($file in Get-ChildItem . -Recurse -Include *.ext) {
    $path = $file.FullName
    New-Item -ItemType HardLink -Value $path -Path $path.Replace("$pwd\", '').Replace('\', '-')
}

It recurses down from the directory it's run in, finds all files ending in .ext, and creates hard links to them in the current directory, naming each new link after the file's relative path (from the tree root), with backslashes replaced by dashes. Because the links' targets are inodes in the filesystem, they are absolute and the links can be placed anywhere. You can delete the links freely; a file's data isn't deleted until all of its hard links are gone.

Replace .ext with whatever file extension the loops use, or if there are multiple extensions you can separate them with commas, e.g.: -Include *.wav,*.mp3

You can also change the formatting of the link names, e.g. use something other than dashes (the last '-'), but including part of the path in the filename is probably a good idea to avoid filename clashes.

Absolutely no warranty. 😄

 

 

Wow, that's too much rocket science for me.


1 hour ago, Paul Young said:

Wow, that's too much rocket science for me.

The "files" you see in File Explorer are actually hard links. What they link to is a data structure (an inode) in the file system, an abstraction the OS uses to manage storage, which is used to find the actual data (and some metadata). When you "move" a file from one folder to another on the same volume, the OS doesn't actually move the data around on the physical storage, just the link.

The OS (or FS driver) does reference counting on the inodes: it increments or decrements the count when a hard link is created or deleted, and when the count reaches zero the inode is removed from the FS table and the physical storage space it denotes is flagged as free. You can create many such links that point to the same file, regardless of their name or location in the filesystem. The new link is not a reference to the original link (which we normally call "the" file); rather it takes its reference from the same inode, and the two are equal and independent of each other. The actual data only gets scrapped by the OS when both are deleted.

There are some gotchas, such as file indexing/caching possibly depending on the reference used rather than the FS table entry, and probably a few ways to shoot yourself in the foot if you try to be too clever.

tl;dr for programmers: It's like copying a pointer to garbage-collectable memory.
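The pointer analogy can be made concrete in a few lines of Python, using the standard library's os.link (the filenames here are made up for illustration):

```python
import os
import tempfile

# Two directory entries, one inode: the data survives until
# the inode's link count drops to zero.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "loop.wav")
    link = os.path.join(d, "loop-link.wav")
    with open(original, "w") as f:
        f.write("audio data")

    os.link(original, link)                # second hard link to the same inode
    assert os.stat(original).st_nlink == 2 # link count is now 2

    os.remove(original)                    # delete "the" file...
    with open(link) as f:                  # ...the data is still reachable
        assert f.read() == "audio data"
```

Deleting `original` only removed one reference; the inode (and the data) stays until `link` is deleted too.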


