Compare large files on network storage

I'm coming from testing the sample code here: https://devblogs.microsoft.com/scripting/use-powershell-to-compare-two-files/

I'm writing a script in which I need to copy a large DB backup file (>120 GB) from a network drive to my local drive for some processing. The thing is: the file doesn't change very often (it's a backup from one of our customers, who only occasionally updates it), so I want to skip the whole copy process when it hasn't changed.

Now, the problem with the script in the blog is that it still takes about as long as copying the file without checking. I assume that's because PowerShell needs to read the whole file to compute the hash, but I'm not sure.

Anyway, I'm looking for performant alternatives. Something like "just compare the file size" (would that still work if you rename the file? It should, shouldn't it?) or something else in that direction.
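For reference, here is a minimal sketch of the size/timestamp idea I had in mind (the paths are placeholders; it assumes the server's last-write time is trustworthy). Reading these attributes only touches file metadata, not the 120 GB payload:

```powershell
# Hypothetical paths - adjust to your environment
$source = "\\fileserver\backups\customer.bak"
$dest   = "C:\work\customer.bak"

$src = Get-Item $source
$dst = Get-Item $dest -ErrorAction SilentlyContinue

# Copy only if the destination is missing, or size/timestamp differ.
if (-not $dst -or
    $src.Length -ne $dst.Length -or
    $src.LastWriteTimeUtc -ne $dst.LastWriteTimeUtc) {

    Copy-Item $source $dest -Force

    # Windows file copies normally preserve the last-write time, but
    # set it explicitly so the next comparison is guaranteed to match.
    (Get-Item $dest).LastWriteTimeUtc = $src.LastWriteTimeUtc
}
```

Would this be reliable, or are there cases where size and timestamp match but the content differs?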

P.S. The whole environment is on Windows, in case that matters.

P.P.S. Here is a copy of the code I tested, so you don't have to search the blog for it:

$fileA = "C:\fso\myfile.txt"
$fileB = "C:\fso\CopyOfmyfile.txt"
$fileC = "C:\fso\changedMyFile.txt"

if ((Get-FileHash $fileA).Hash -ne (Get-FileHash $fileC).Hash) {
    "files are different"
} else {
    "Files are the same"
}
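One alternative I'm considering instead of hashing is Robocopy, which ships with Windows: by default it compares size and timestamps and skips files it classifies as "same", so an unchanged backup should cost only a metadata check. A sketch with placeholder paths:

```powershell
# Placeholder share/paths - robocopy copies customer.bak only when its
# size or timestamp differ from the local copy; otherwise it is skipped.
robocopy "\\fileserver\backups" "C:\work" "customer.bak"
```

Is that the right tool here, or does it have the same pitfalls as a plain size comparison?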


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
