Grandalf

Very active
Staff member
Moderator
Joined: 26 May 2015
Posts: 19209
Reactions/Likes: 55912
QuickHash GUI is an open-source graphical interface for Linux, Windows and Apple Mac OSX that makes hashing data quick and easy: text, text files line by line, binary files, file comparisons, folder comparisons, disks and disk volumes (run as administrator), and Base64 data. It can also copy the files in one folder to another folder, hashing the data on both sides to compare them and confirm integrity.

The program was originally designed for Linux, but it is now also available for Windows and Apple Mac OSX. The available hash algorithms are MD5, SHA1, SHA256, SHA512 and xxHash.
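
The core idea, hashing a file and re-hashing its copy to confirm the two are identical, can be illustrated with a minimal Python sketch (QuickHash itself is written in Pascal; the file paths below are made up for illustration):

```python
import hashlib
import shutil

def file_hash(path, algo="sha256", chunk_size=1024 * 1024):
    """Hash a file in chunks so large binaries do not have to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Copy a file, then hash source and destination to verify the copy's integrity.
src, dst = "report.pdf", "backup_report.pdf"   # hypothetical paths
shutil.copy2(src, dst)
print("Source     :", file_hash(src))
print("Destination:", file_hash(dst))
print("MATCH" if file_hash(src) == file_hash(dst) else "MIS-MATCH")
```
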
Source:


[Image: QuickHash-for-Windows.jpg]



Download:
 

Grandalf

Very active
Staff member
Moderator
Joined: 26 May 2015
Posts: 19209
Reactions/Likes: 55912
QuickHash GUI 3.1.0
Version 3.1.0

Update : HashLib4Pascal library updated to master version available as of 18th July 2019.
New : Added SHA-3 (256) hash algorithm
New : Added Blake2B (256) hash algorithm (best on 64-bit systems; faster than MD5, SHA-1, SHA-3, SHA256 and SHA512, more reliable than MD5 and SHA-1, and comparable to SHA-3). A short sketch of both new algorithms follows at the end of this changelog.

New : The FileS tab right-click menu now includes a 'Copy all hashes' option to clipboard ALL the hash values in the hash column. If there are more than 10K values, the user is asked whether to write them to a file instead.
Fix : The "Compare Two Files" tab had a bug. If the user clicked the resulting hash, it would be copied to clipboard correctly but be described as MD5 even if the chosen algorithm was not MD5. Fixed.
Fix : In all tabs, xxHash in 64-bit mode did not show a progress bar. The 32-bit version did though. That discrepancy was fixed.
Fix : When comparing two folders, if there was a count mis-match, the log showed the same name for both folders, instead of "Folder A and Folder B", it said "Folder A and Folder A". Fixed.
Fix : In the FileS tab, when exporting BILLIONS of files, an out-of-memory limit was reached. This should now be fixed due to the implementation of a file write stream instead of using the CSVExport library.
Fix : Theoretical compliance with Apple OSX Catalina's 64-bit enforcement.
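
As a rough illustration of the two newly added algorithms (not QuickHash's own Pascal code; just a Python sketch using the standard hashlib module, with a made-up file name):

```python
import hashlib

def digests(path, chunk_size=1024 * 1024):
    """Compute SHA-3 (256) and 256-bit Blake2B digests of a file in one pass."""
    sha3 = hashlib.sha3_256()                  # SHA-3 (256)
    blake2b = hashlib.blake2b(digest_size=32)  # Blake2B with a 256-bit digest
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            sha3.update(block)
            blake2b.update(block)
    return sha3.hexdigest(), blake2b.hexdigest()

print(digests("setup.exe"))   # hypothetical file
```
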
Download:
 

Camel1965

Very active
Distinguished member
Joined: 8 September 2010
Posts: 37840
Reactions/Likes: 33962
QuickHash GUI 3.3.0
v3.3.0 May 2021
New : Ability to hash forensic images of the Expert Witness Format (EWF), also known as "E01 images". Available for Windows and Linux for users who know what they are doing with regard to forensic images. It is not available for OSX for now. QuickHash will compute the hash and also report the embedded MD5 or SHA1 hash, if available, placing it in the "Expected Hash" field automatically, depending on the type of hash the user is performing. So if the E01 contains both MD5 and SHA1 but the user selects SHA1, then the embedded SHA1 hash will be reported as well as the computed SHA1 hash, and likewise for MD5. More features of this landmark addition to QuickHash GUI will likely follow in future.
New : CRC32 algorithm added for Text, File, FileS, Compare Two Files, Compare Two Folders and Base64. Not added to disks, and not available for EWF (E01) image hashing.
New : Users who utilise the CRC32 algorithm via the FileS tab can now optionally choose whether to just compute the checksums of the files in the folder as normal, or to compute the checksums and then rename each file by appending the checksum in square brackets to the end of its name. Useful for many media and sound specialists who commonly use CRC32 values in their work.
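
A minimal sketch of that optional checksum-and-rename behaviour, assuming a hypothetical folder name and using Python's zlib.crc32 (QuickHash itself does this in Pascal):

```python
import os
import zlib

def crc32_of(path, chunk_size=1024 * 1024):
    """CRC32 of a file, computed in chunks."""
    crc = 0
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(block, crc)
    return crc & 0xFFFFFFFF

folder = "music"   # hypothetical folder
for name in os.listdir(folder):
    path = os.path.join(folder, name)
    if os.path.isfile(path):
        stem, ext = os.path.splitext(name)
        # Append the checksum in square brackets, e.g. "track01 [8D3F2A1C].flac"
        os.rename(path, os.path.join(folder, f"{stem} [{crc32_of(path):08X}]{ext}"))
```
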
New : Button added to enable the user to easily make a copy of the backend SQLite database at any given point in time, for convenience. This can help users who may wish to load it into specific database tools, like SQLite Explorer or browser extensions like SQLite Manager.
New : The About menu now contains a "Check Environment" option (available on all OS platforms, though the results vary on each) that scans for DLLs, reports database information, etc.
New : Logo replaced with the newer QuickHash logo.
New : In some parts of QH where display grids of data are generated (FileS, Copy, Compare Two Folders), the user can now select their own delimiter character via a drop-down menu, such as the tab character, the hyphen and (heaven forbid) even the space character. If no character is chosen, a comma is assumed and used as before.
New : The user can now use the About menu to establish the version of SQLite that is being used by QuickHash.
Improvement : Monumentally large changes to "Compare Two Folders" processing, scrubbing away much of the earlier effort and restructuring it, with big thanks to an open-source co-developer who has helped me here. Key amongst them is that v3.3.0 addresses a bug where rows got mis-aligned if the file counts differed. The mismatch was still correctly reported in v3.2.0, and even if the file counts matched but the hashes differed, that was also still OK. But the rows got out of sync if the file counts differed, due to there being fewer files in one folder than the other and my use of the UPDATE SQL statement. Additional restructuring was applied, but note that C2F is really designed to check that two folders are a mirror image of each other; it is supposed to help you confirm this is the case, rather than help you clean up your disk. If your aim is to use it as a file manager, then QH might not be the best option. Other tools like Beyond Compare might be better for your needs here.
That said, the ability now exists to compare files by name and hash value in both folders, and the user can then right-click the results to see many other options too :
~ Restore Results view
~ Clipboard all rows
~ Clipboard selected row
~ Clipboard all selected rows (currently does it in reverse order for some reason)
~ Show mismatches (new, based on filename or hash or both)
~ Show duplicates (new, offers the chance to clipboard immediately after because the column row changes for this display)
~ Show matching hashes (new)
~ Show different hashes, not missing files (new)
~ Show missing FolderA files (new)
~ Show missing FolderB files (new)
~ Show missing files from Folder A or FolderB (new)
~ Save as CSV file
~ Save as HTML file
That is a whopping array of ways to conduct analysis of two folders; it is about as good as I think I can make it, and it is based on help from the community. If that still falls short, other tools are available, or get stuck in yourself and help me.
Improvement : DB Rows were being counted (when required) using a slower method than I had realised. With v3.3.0, counts are now immediate by calling DBGrid.DataSource.DataSet.RecordCount;
Improvement : Column headers added to CSV and HTML outputs (achieved by right-clicking the display grid results throughout). I may have missed one, but I think I have them all covered.
Improvement : Removed the generation of a "QH_XXXXX" timestamp-named parent folder in the destination folder when copying, as many users reported this was unhelpful.
Improvement : SQLite DLLs for Windows replaced with stable version 3.35.5.0 as of April 2021 (replacing former version 3.21.0.0).
Improvement : The size of some fields in SQLite was set to 32767 to account for crazily large filename and filepath combinations. On reflection, that seems an extreme use of memory for what must be a one-in-a-billion chance and very unlikely to be encountered. Instead, a size of 4096 is set in v3.3.0, which still enables QH to account for very long paths, given that filenames alone can rarely exceed 255 characters (even where paths can) on any of the three OSes, except for some UTF-8 and UTF-16 variances, and even with those the maximum is still 1020 bytes (4 bytes for every single character of the 255 maximum).
Improvement : Disk hashing module now presents more data in the list view, especially for logical volumes, such as the filesystem.
Improvement : The button to launch the disk hashing module now gives the user an indication of what it is doing while it loads the treeview of disks and volumes
Improvement : The system RAM label has been moved from the main interface to the new Environment Checker section of the About menu (Windows only). This frees up some GUI real estate and avoids using resources unnecessarily.
Fix : DisableControls and EnableControls are used more extensively to expedite the "Save as CSV" and "Save as HTML" options for large volumes of data, as some users reported save efforts taking several hours for millions of rows of data. This makes sense, because QuickHash was repainting the display grid after each row was written to file.
Fix : When saving results as CSV in Compare Two Folders, if the user selected an existing file to overwrite, it would do that, but the next run would result in an infinite loop telling the user the file already exists and to choose another one, without actually being able to do so. That was fixed.
Fix : Apple's new OSX 'Big Sur' OS unhelpfully removed static libraries, like the SQLite library, so it could not be referenced by file path. A different method of lookup is needed using the dynamic linker cache, and a 3-state compiler directive is now used for loading SQLite, depending on the OS being used. That has been applied so that Apple users can continue to enjoy the benefits of QuickHash on that most changing and challenging of operating systems. You're welcome.
Fix : Two stringlists are created when using "Compare Two Folders" to store the list of files for analysis. I had introduced a memory leak here without realising it and that has been corrected (with thanks to an open-source developer who spotted that for me).
Fix : A small memory leak existed in frmSQLiteDBases.DatasetToClipBoard for copying data to clipboard. The CSVClipboardList string list that was used to achieve this was not being freed. Now it is freed.
Fix : In the basic results txt file that is created during Compare Two Folders, the selected folder names in the log file were prefixed with the LongPathOverride of two backslashes, a question mark and a backslash. That was corrected to just show the normal path, as users don't really need to see that (it is just an API switch).
Fix : .Value was used extensively to "call" a value from a DB cell. But some cells can be NULL in QuickHash, and if they are, using .Value can generate an error. This has now been switched to .AsString, meaning a NULL value returns an empty string, as intended.
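
The same NULL-to-empty-string idea, shown with a throwaway Python sqlite3 example (the table and column names here are invented, not QuickHash's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (filename TEXT, hash TEXT)")
conn.execute("INSERT INTO results VALUES ('a.txt', NULL)")  # hash not computed yet

row = conn.execute("SELECT hash FROM results WHERE filename = 'a.txt'").fetchone()
hash_value = row[0] if row[0] is not None else ""   # treat NULL as an empty string
print(repr(hash_value))                              # ''
```
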
Fix : In the Text tab, the "Expected Hash" lookup was not applied for xxHash, which had been missed before, so if users pasted an expected xxHash value, it would not be checked against the computed hash. That was fixed.
Fix : In the File tab, the "Expected Hash" lookup was not applied for xxHash either, so if users pasted an expected xxHash value, it would not be checked against the computed hash. That was also fixed.
Fix : The disk hashing module showed the field for Blake after hashing, even if empty and not computed, and was not being hidden like the others. That was fixed.
Fix : The disk hashing module reported "Windows 8" when run on "Windows 10". This was not actually wrong, but misleading, and is due to the Windows API being woeful in parts with regard to how the "number" and "name" of Windows are reported. So a new function was created to talk to ntdll.dll directly, so that the major, minor and build versions are now all reported.
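
One common way to get the true major/minor/build numbers, bypassing the compatibility behaviour of the older version APIs, is RtlGetVersion from ntdll.dll. A Windows-only Python/ctypes sketch of that idea (not the Pascal code QuickHash actually uses):

```python
import ctypes
from ctypes import wintypes

class OSVERSIONINFOEXW(ctypes.Structure):
    _fields_ = [
        ("dwOSVersionInfoSize", wintypes.DWORD),
        ("dwMajorVersion", wintypes.DWORD),
        ("dwMinorVersion", wintypes.DWORD),
        ("dwBuildNumber", wintypes.DWORD),
        ("dwPlatformId", wintypes.DWORD),
        ("szCSDVersion", wintypes.WCHAR * 128),
        ("wServicePackMajor", wintypes.WORD),
        ("wServicePackMinor", wintypes.WORD),
        ("wSuiteMask", wintypes.WORD),
        ("wProductType", wintypes.BYTE),
        ("wReserved", wintypes.BYTE),
    ]

def windows_version():
    info = OSVERSIONINFOEXW()
    info.dwOSVersionInfoSize = ctypes.sizeof(info)
    # RtlGetVersion reports the real version, unaffected by manifest-based compatibility shims.
    ctypes.windll.ntdll.RtlGetVersion(ctypes.byref(info))
    return info.dwMajorVersion, info.dwMinorVersion, info.dwBuildNumber

print(windows_version())   # e.g. (10, 0, 19043) on a Windows 10 machine
```
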
Code : Adjusted variable naming in the "ProcessDir" function relating to source and destination folders because it was so confusing I did not even understand it several years after first writing it.
Code : More effort made to initialise variables
Code : Disk module code entirely refactored to be more efficient, to produce more useful data for the user, and to help safeguard against null values and removable drive bays with no disks, and for general ease of reading. It should now also be able to read (and hash) CD and DVD disks, for example.


v3.2.0
New : Blake3 hash algorithm added for text strings, a file, Files recursively, Compare Two Folders and Compare Two Files.
New : Blake3 hash algorithm added to disk hashing module
Fix : Hashing of physical disks in Linux via the "Hash Disk" module is re-enabled after "Access Violations" reported for earlier versions.
Fix : In the "Compare Two Folders" tab, if the "Log Results" was unticked, it generated an access violation. That was fixed.
Fix : In the "Copy" tab, when the results are shown in the display grid, the navigation buttons were not clickable. That was fixed.
Fix : In the "FileS" tab, when the results are shown in the display grid, the navigation buttons were not clickable either. That was fixed.
Fix : "Time Taken" in the "File" tab was showing a 24hr clock instead of showing as the number of seconds elapsed, as intended. That was fixed by utilising GetTickCount.
Fix : In the "Compare Two Files" tab, the "Result: Match" value was only showing if the timed scheduler was invoked (though the actual hashes were still being displayed). This was caused due to a loop error where the result only displayed inside the scheduler loop. This has been fixed by moving it out of the loop, so that the result is shown either immediately with no scheduler being used, or following the scheduled invoke.
Update : "LCL Scaling" is now incorporated which will hopefully better enable the GUI display on variously sized resolution settings. User feedback will confirm in due course.
New : In the 'Compare Two Folders' tab, users have asked for a grid view of the files compared rather than just a text file output. That has been added, with many of the usual right-click menu options such as copy to clipboard, save as HTML, etc.
Update : In the 'Compare Two Folders' tab, the option "Cont. if count differs?" has now been removed for several reasons:


  1. Users were frustrated at analysing often millions of files, only to realise after the event that, in order to continue if the file count differed, they had to do it all over again with the option checked.
  2. More users than I anticipated when I first added that feature use the "Compare Two Folders" comparison not to check that both folders do indeed match, but in fact to determine in what way they differ.
  3. It is more thorough, albeit slower, to iteratively check the hashes of both folders both ways rather than checking based merely on count.
Fix : In the 'Compare Two Folders' tab, if the user selects "Log Results" (which is enabled by default), then in Linux and OSX the text log file was only being populated with the hash values and not the filenames too. That was fixed.
Important Fix : In the 'Compare Two Folders' tab, the comparison was not sufficiently two-way, meaning that if Folder 2 matched Folder 1 it would report a match, but if Folder 1 did not match Folder 2, it might still report a match when it should report a mismatch. This has now been modified to a 3-way comparison. First it checks the file count, then it compares both hash lists against each other: HashListA against HashListB, then HashListB against HashListA. This has obviously made the comparison slightly slower for millions of files, but hopefully not too significantly, and accuracy is more critical than speed. A minimal sketch of such a two-way comparison appears after the bug report below.

Here is a copy of the bug report, provided for full transparency :


"The folders were set up like this:
Folder 1: File A, File A (copy)
Folder 2: File A, File B


When running the compare feature, selecting Folder 1 and then Folder 2, the tool reported a match, "The files in both folders are the same. MATCH!" I had expected a mis-match. Yes, all of the files in Folder 1 are in Folder 2, but not all of the files in Folder 2 are in Folder 1. If I re-ran the compare feature in reverse, selecting Folder 2 and then Folder 1, a mis-match was reported, "The files of both folders are NOT the same. The file count is the same, but file hashes differ. MIS-MATCH!" This seemed odd because if I ran this features with the following situation:
Folder 1: File A
Folder 2: File A, File B
I would receive a mis-match message, but the situation is basically the same, all Folder 1 files are within Folder 2, just like if two copies of File A were in Folder 1, which reported a match.
I had expected a back and forth comparison, but it appears to be a one-way comparison. "


I thank the reportee who brought this to my attention and it should now be resolved in v3.2.0.
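
A minimal sketch of the two-way comparison idea described above, hashing every file in two folders and checking both directions (hypothetical folder names, SHA-256, plain Python; QuickHash's real implementation is Pascal and SQLite-backed):

```python
import hashlib
import os

def folder_hashes(folder):
    """Map each file's path (relative to folder) to its SHA-256 digest."""
    result = {}
    for root, _, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(1024 * 1024), b""):
                    h.update(block)
            result[os.path.relpath(path, folder)] = h.hexdigest()
    return result

a, b = folder_hashes("Folder1"), folder_hashes("Folder2")   # hypothetical folders
missing_in_b = sorted(set(a) - set(b))                      # in Folder1, not Folder2
missing_in_a = sorted(set(b) - set(a))                      # in Folder2, not Folder1
differing = sorted(p for p in set(a) & set(b) if a[p] != b[p])

if not (missing_in_a or missing_in_b or differing):
    print("MATCH: both folders contain the same files with the same hashes")
else:
    print("MIS-MATCH:", missing_in_a, missing_in_b, differing)
```
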
 

Camel1965

Very active
Distinguished member
Joined: 8 September 2010
Posts: 37840
Reactions/Likes: 33962

QuickHash 3.3.1

What's new in QuickHash 3.3.1

January 7, 2022
  • Function CountGridRows has been changed. This function was designed to count the number of rows in any given display grid to determine whether the clipboard could be used, or whether the data would be saved to a filestream. And, if the user chose to save the output to CSV or HTML, the same function would check whether an in-memory list of strings could be used and then saved out to a file, or whether a filestream should be used line by line.
  • But, when saving the output of very large lists of files to HTML, filestreams were supposed to be used rather than RAM. However, due to the v3.3.0 adjustment of the function CountGridRows to use .RecordCount, .First and .Last, the variable that was used to check the number of rows only reflected what was on screen instead of what was in the table. So QH was still using RAM even if the row count was many hundreds of thousands!! As such, it would cause QH to crash with large volumes of data. Fixing this required two significant changes:
  • Changes to CountGridRows means a dedicated TSQLQuery is used on the fly, instead of the DBGrid itself.
  • Changes to the function call of CountGridRows now mean that both the grid and the table to query are passed.
  • Major changes to the functions SaveFILESTabToHTML, SaveCOPYWindowToHTML and SaveC2FWindowToHTML to use TSQLQueries too, instead of DBGrid queries. All three can now handle many thousands of rows more easily and are executed in just a few seconds. A test of 407K rows was saved as a 56Mb HTML file in under 10 seconds. However, I have noticed that the step of preparing the data for display in Compare Two Folders does take a long time for many tens of thousands of files. It gets there eventually, but it can take a while. This is due to the enormous SQL statement that was added in v3.3.0 in the repareData_COMPARE_TWO_FOLDERS function. This was added to give users greater abilities to find and sort, following earlier pre-v3.3.0 complaints that the comparison was not granular enough. It is more granular now, but that has come at the cost of taking longer to prepare. Something to work on for v3.3.2. (A sketch of the counting-and-streaming approach follows this list.)
  • The changes described above are the largest service-release aspects of this version.
  • The user is now also shown a message on screen, with an OK button, to let them know a Save as HTML has finished. Useful if the data set is very large and the save takes some time.
  • The HTML file produced by right clicking in the FileS tab did not have a row 1 header if the row count was over 20K. Now it does.
  • The HTML file produced by right clicking in the FileS tab did not have the FileSize column if the row count was over 20K. Now it does.
  • The HTML file produced by right clicking in the FileS tab did not have the ID column if the row count was LESS than 20K. Now it does. (note that this has not been added for clipboard output on the assumption it would be pasted into spreadsheets where rows are automatically then counted)
  • (See - there were a lot of things missing in the HTML save for large volumes of data that I had missed - this is how small-scale testing on your own does not compare with real-world usage - it's only when users report issues to me that I often get to know about problems, and then in turn, that unearths other issues that I can then fix)
  • On Linux and OSX, the "Curently Hashing" status in the FileS tab was chopping off the first characters of the path. So instead of saying /home/user/Documents/MyFile.doc it was saying e/users/Documents/MyFile.doc. This was due to the long path override character cleansing that is necessary for Windows, but not for Linux or OSX, and I forgot to use a cross-platform compiler directive. Now fixed in v3.3.1.
  • The function DatasetToClipBoardFILES checked whether the number of rows was less than 20K, but didn't show a message instructing the user to use a file save if the count was greater than 20K. That has now been added in v3.3.1, so they don't just sit there wondering what has happened.
  • If the user tried to clipboard a volume of data over 20K rows in the FileS tab, although the user was told to use a file save instead, the status still said it was copying to clipboard. Now it will tell the user the clipboard effort has been aborted.
  • The Clipboard button in the "Copy" display grid was not as complete as the right-click clipboard option. A remnant of the changes made in v3.3.0. I think. Now both methods produce the same clipboard content.
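
The gist of the CountGridRows and Save-as-HTML changes, counting rows with a dedicated query and streaming the export line by line instead of building it in memory, can be sketched as follows (the database file, table and column names are invented for illustration; QuickHash does this in Pascal with TSQLQuery and filestreams):

```python
import sqlite3

conn = sqlite3.connect("quickhash_results.db")   # hypothetical database

# Count rows with a dedicated query instead of inspecting the on-screen grid.
(total_rows,) = conn.execute("SELECT COUNT(*) FROM filehashes").fetchone()

# Stream the result set to disk row by row so the HTML never has to fit in RAM.
with open("export.html", "w", encoding="utf-8") as out:
    out.write("<html><body><table>\n")
    out.write("<tr><th>ID</th><th>Filename</th><th>Hash</th></tr>\n")
    for row_id, filename, digest in conn.execute(
        "SELECT id, filename, hash FROM filehashes"
    ):
        out.write(f"<tr><td>{row_id}</td><td>{filename}</td><td>{digest}</td></tr>\n")
    out.write("</table></body></html>\n")

print(f"Wrote {total_rows} rows to export.html")
```
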
 