Program to Find Long Path Names

Does anyone know of a free or very cheap piece of Windows-compatible software to spider through a complex directory structure, either just producing a text list of all path/filenames or searching for path/filenames over a certain length?

We’re transitioning shared drive systems and IT will be using a copy system that will shit itself on files whose total name + path is over 255 characters and, well, my coworkers are very fond of extremely descriptive names. . . and given that our shared drive system spans some 20,000 files, I’d really rather not search them manually.

Any chance you know a bit of Python? This sounds very feasible in Python. Probably not the most helpful suggestion, since I don’t have the expertise to do it for you, but I’m sure if you google, you may find something workable. Maybe this?
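For what it's worth, the Python version is short. Here's a rough sketch using os.walk — the share root at the bottom is a placeholder you'd have to fill in with your actual drive:

```python
import os

def find_long_paths(root, limit=255):
    """Walk the tree under `root` and yield every full path
    (directories and files) longer than `limit` characters."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if len(full) > limit:
                yield full

if __name__ == "__main__":
    # Placeholder root; point this at your shared drive.
    for path in find_long_paths(r"S:\SharedDrive"):
        print(len(path), path)
```

That's the whole job: walk, measure, print. No low-level OS integration required.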

I started a “Learn Python” guide and got about as far as If statements before interest dwindled. I used to be able to manage the level of complexity of a simple text-based adventure game in Java/C++, but that was many, many years ago and obviously didn’t have to navigate things like low-level OS integration.

Moreover, due to some previously bitched-about issues here at work, I’m not currently inclined to put more than a low amount of effort/thought into this process. Normally I’d be all for self-improvement to finish a task, but I just don’t have a lot of incentive to try very hard at this place right now :-/

God that sounds a lot worse now that I’ve typed it out.

There’s an easy way to list everything. Grab GNU findutils, then run this:

"<wherever you installed it>\find" <directory you care about> -regex .*

Example:
C:\>"C:\Program Files (x86)\GnuWin32\bin\find" C:\Users\aiuag81 ester -regex .*

C:\Users\aiuag81 ester
C:\Users\aiuag81 ester/anotherdir
C:\Users\aiuag81 ester/anotherdir/file.txt.txt
C:\Users\aiuag81 ester/anotherdir/longerfile.txt - Copy (2).txt
C:\Users\aiuag81 ester/anotherdir/longerfile.txt - Copy.txt
C:\Users\aiuag81 ester/anotherdir/longerfile.txt.txt
C:\Users\aiuag81 ester/file.txt.txt
C:\Users\aiuag81 ester/longerfile.txt - Copy (2).txt
C:\Users\aiuag81 ester/longerfile.txt - Copy.txt
C:\Users\aiuag81 ester/longerfile.txt.txt

In theory you could also figure out a regex that would force it to only match the long file names, but I’m not sure how to go about that one.

Edit: Actually with bash and grep in addition to find you could do it. Easiest way would be to install Cygwin to get all of those, then do:

find <your directory> 2>/dev/null | grep -E ".{255,}"

Probably far easier to pass that file list into your scripting language of choice than to write that regex. The regex might be more fun though. But maybe

…*

would work. If you don’t explode some maximum command line length.

Edit: Weird, I dunno where those spaces came from. In the edit window that’s just 256 .'s without spaces. But rereading the problem description I think you want 257.

Edit 2: Or ".{256,}" like Bob just added. Sneaky.
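If the off-by-one arithmetic gets confusing, it's easy to sanity-check in any regex engine. A quick check in Python's re (the counts are the point, not the tool) — `{n,}` means "n or more", so `.{256,}` matches exactly the lines that are strictly longer than a 255-character limit:

```python
import re

# .{256,} requires at least 256 characters on the line,
# i.e. it matches anything over a 255-character limit.
over_limit = re.compile(r".{256,}")

print(bool(over_limit.search("x" * 255)))  # False: exactly at the limit
print(bool(over_limit.search("x" * 256)))  # True: one character over
```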

Installed Cygwin and all its components (near as I can tell), ended up making minor modifications to the code:

C:\cygwin> find SHAREDDRIVE: 2>C:\Test | grep -E ".<255,>"

Receiving “access is denied”

Had to change the /dev/null part because this is a Windows system without such a directory; I just pointed it somewhere easy to check, since I wasn't actually sure what 2> was doing.

You can also use a text editor that supports regular expressions (Sublime Text, Notepad++, UE, etc). Capture your dir listing to a temp file in a command line window:

dir /b /s > temp.txt

Open the file and perform a regular-expression search for .{256,} and have it select all matching lines.

If you’re looking to do some bulk renaming, there’s a great utility called Bulk Rename Utility that’s scary-looking but quite powerful.

That’s just redirecting error messages. Change it to something simple like err.txt. The access denied is probably because it’s trying to write text to your C:\Test directory.

find SHAREDDRIVE: 2>err.txt | grep -E ".{255,}"

Did you change the {} to <> on purpose? When I played around with it on my Linux box (no Cygwin install handy right now), it was {}.

The { to < error was a typo on my part when recording what I was doing (the default CMD font on this PC makes {s look a lot like <s).

I rejiggered the code to match your suggestion and now get "'grep' is not recognized as an internal or external command, operable program or batch file."

Grep.exe is definitely under /cygwin/bin, so I moved to that directory before running. That seemed to get it working. I now receive a .txt file with the following:

“FIND: Parameter format not correct”


I also tried the “dir /b /s > temp.txt” command while sitting at the root of the shared drive. After about 10 minutes it spat out an 8 meg text file with some 81,000 lines to it which a regex search showed had no entries past 255 characters.

My concern here is that, AFAIK, the 255-character path limit is a Windows limit (we're running a heavily modified version of WinXP here), so it's possible native Windows command line tools are skipping, omitting, or just plain not seeing any potential problem paths. Since there were definitely some paths in the .txt file in the 230-240 character range, you can understand my concern :(
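One way to double-check that nothing is being silently skipped would be to walk the tree using Windows' extended-length path prefix (the four characters backslash-backslash-question-mark-backslash), which tells the API to lift the classic MAX_PATH cap. A rough Python sketch, assuming the prefix behaves as documented on your XP build — the root argument is whatever drive you point it at:

```python
import os

def extended(path):
    """Prefix a Windows path with the extended-length marker so the
    OS doesn't cap path handling at the classic MAX_PATH limit."""
    return "\\\\?\\" + path

def long_entries(root, limit=255):
    """Yield full paths under `root` longer than `limit`, walking via
    the extended-length prefix so deep trees aren't silently missed."""
    for dirpath, dirnames, filenames in os.walk(extended(root)):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if len(full) - 4 > limit:  # don't count the 4-char prefix
                yield full[4:]         # report the path without it
```

If that walk turns up entries the native tools missed, you'd have your answer.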


Thanks to all who’ve jumped in on this one, by the way. It means a lot.

I used Microsoft’s Logparser for this last time I needed to do it…can’t remember the syntax offhand, but yea. It DOES work.

http://www.microsoft.com/en-us/download/details.aspx?id=24659

edit: aha!
logparser "select path from c:\*.* where strlen(path)>260" -i:FS -recurse:-1 > c:\longpaths.txt

LogParser seems to agree with me: it only found the same dozen or so files with 230-240-character names, so perhaps we are in the clear after all (I first modified the search to hit anything over 250 and got nothing).

Interestingly, LP reports 104K elements were processed as compared to the 81K lines in the .txt file that “dir /b /s > temp.txt” produced. Maybe LogParser counts actual folders as elements, while the dir /b /s command only notes actual files within folders? Perhaps so.
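That speculation could be checked directly — a quick walk that tallies folders and files separately would show whether the extra ~23K elements are directories. A sketch in Python, pointed at whatever root you care about:

```python
import os

def tally(root):
    """Count directories and files separately under `root`."""
    dirs = files = 0
    for _, dirnames, filenames in os.walk(root):
        dirs += len(dirnames)
        files += len(filenames)
    return dirs, files

# e.g. dirs, files = tally(r"S:\SharedDrive")  -- placeholder root
```

If dirs comes out around 23K, the two counts reconcile.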

The find error is probably due to Cygwin not playing nice with Windows drive mappings. Something like this would probably work (assuming your shared drive is E:):

find /cygdrive/e 2>err.txt | grep -E ".{255,}"

But it looks like you’ve got a more Windows-based method to work, so all good.

Well, I can do it in unix, so that means you could use Cygwin to do it under Windows.

Something like “ls -R” (recursive) piped into, uh, uh, there must be a simple filter that works just on input line length. Well, I don’t know what that filter is offhand, but you could use sed or awk with a pattern that matches lines longer than your cutoff length. It would take longer to download and install cygwin than to run the command.

I think this would work in awk, but it seems awkward (heh), and there is probably something simpler:

ls -R | awk '{ if (length($0) > 255) print }'

This would print every line longer than 255 characters starting from the working directory — though note ls -R lists bare filenames under directory headers, so it checks name length rather than full path length; find . | awk would be closer to what you actually want. But no doubt there is a windows native utility that will do what you want too :)

Oh man, points for creativity guys but the proper tool in this case is powershell:
Get-ChildItem -Recurse "D:\Pictures" | ForEach-Object -Process { $_.FullName } | Sort-Object Length
Which just shows everything sorted by the length of the full path.
Get-ChildItem -Recurse "D:\Pictures" | ForEach-Object -Process { if ($_.FullName.Length -ge 255) { $_.FullName } } | Sort-Object Length
And this one shows everything greater than or equal to 255 characters for the full path.

That’s pretty smooth. Good to know for future reference.