DevToolNow

Hash Algorithms Compared: MD5, SHA-1, SHA-256, bcrypt, and Argon2

DevToolNow Editorial Team · 10 min read

"Hash function" covers two very different jobs that share a name. One is making a fingerprint — fixed-size, fast, deterministic — for indexing, caching, and integrity checks. The other is making something an attacker cannot reverse, even with a stolen database and a year of GPU time. Treating the two as the same algorithm choice is how breaches happen. This guide separates them, walks the algorithms you actually encounter, and ends with a decision tree.

1. Cryptographic vs non-cryptographic hashes

A non-cryptographic hash like FNV, MurmurHash, or xxHash optimises for one thing: speed. It produces well-distributed output for use as hash table keys, bloom filter inputs, or rolling checksums. It makes no claims about resistance to a deliberate adversary trying to find collisions or preimages.
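To make the speed-over-security trade concrete, here is a minimal sketch of FNV-1a, one of the simplest non-cryptographic hashes: a constant, a multiply, and an XOR per byte. This is an illustrative implementation of the public-domain algorithm, not a library you should prefer over an existing one.

```python
def fnv1a_64(data: bytes) -> int:
    """64-bit FNV-1a: XOR each byte in, then multiply by the FNV prime.
    Fast and well-distributed for hash tables, but trivially attackable --
    never use it where an adversary controls the input and the stakes."""
    h = 0xcbf29ce484222325                 # 64-bit FNV offset basis
    for byte in data:
        h ^= byte
        h = (h * 0x100000001b3) % 2**64    # 64-bit FNV prime, wrapped to 64 bits
    return h

print(hex(fnv1a_64(b"hello")))
```

The entire state is 64 bits and every step is invertible, which is exactly why it is fast and exactly why it offers no collision resistance.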

A cryptographic hash adds three properties: preimage resistance (given a hash, you cannot find an input that produces it), second preimage resistance (given an input, you cannot find a different input with the same hash), and collision resistance (you cannot find any two inputs with the same hash). NIST SP 800-107 and FIPS 180-4 are the canonical specifications. When any of these properties breaks for a real algorithm, that algorithm is no longer safe for the use case that depends on it.
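Two of the observable consequences of these properties are determinism and the avalanche effect, both easy to see with Python's standard-library `hashlib` (the input strings here are arbitrary examples):

```python
import hashlib

a = hashlib.sha256(b"transfer $10").hexdigest()
b = hashlib.sha256(b"transfer $11").hexdigest()   # one character changed

print(a)
print(b)
# Deterministic: the same bytes always produce the same 256-bit
# (64-hex-character) digest. Avalanche: a one-character change flips
# roughly half the output bits, so the digests share no visible structure.
diff = sum(x != y for x, y in zip(a, b))
print(f"{diff}/64 hex characters differ")
```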

2. MD5: broken since 2004, still useful

MD5 produces a 128-bit digest. It was the default hash function of the 1990s and was the first widely used algorithm to fall to collision attacks. Wang and Yu's 2004 attack reduced collision-finding from theoretical to practical, and by 2008 chosen-prefix collisions enabled real attacks against certificate authorities (the rogue CA paper by Sotirov, Stevens, et al.).

What MD5 still does well: producing a short, fast fingerprint of file content for non-adversarial settings. Object stores use it as an ETag, cache layers use it as a key, and ETLs use it to spot whether a row changed. The rule is simple: if an attacker has anything to gain from finding a collision — a forged signature, an authentication bypass, an integrity check that fails open — MD5 is the wrong choice.
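A typical non-adversarial use looks like this change-detection sketch (the row contents are made up for illustration):

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # MD5 as a non-security fingerprint: fine for cache keys and
    # change detection, never for signatures or password storage.
    return hashlib.md5(content).hexdigest()

row_v1 = b"id=42,name=Ada,email=ada@example.com"
row_v2 = b"id=42,name=Ada,email=ada@example.org"
print(fingerprint(row_v1) == fingerprint(row_v2))  # False -> the row changed
```

Nothing here fails open if a collision exists, and no attacker gains anything by finding one, which is what keeps this usage acceptable.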

3. SHA-1: collision attack 2017, deprecated for signatures

SHA-1 produces a 160-bit digest and was specified in FIPS 180-1 in 1995. Theoretical attacks chipped away at it through the 2000s, and the 2017 SHAttered paper demonstrated a practical identical-prefix collision: two PDF files with identical SHA-1 hashes but different content, generated with roughly 6,500 CPU-years plus about 100 GPU-years of compute, which Google provided. The stronger chosen-prefix variant followed in 2020 ("SHA-1 is a Shambles", Leurent and Peyrin). All major browsers stopped trusting SHA-1 TLS certificates in 2017, and NIST plans to disallow SHA-1 for all uses by the end of 2030; for new code, it is already over.

Git is the most visible holdout, and the project is mid-migration to SHA-256. For everything else, treat SHA-1 the way you treat MD5: fine as a non-security checksum, never as a security primitive in new code.

4. SHA-256 and the SHA-2 family: the current standard

SHA-256 is the workhorse of the SHA-2 family (FIPS 180-4). It produces a 256-bit digest, is unbroken in 2026, and is the default cryptographic hash you should reach for unless you have a specific reason not to. It is what TLS certificates are signed with, what JWT tokens use under HS256/RS256/ES256, and what blockchains use for proof-of-work and Merkle trees.

SHA-3 (Keccak, FIPS 202, standardised in 2015) is also unbroken and uses a fundamentally different construction (a sponge rather than a Merkle–Damgård chain). It exists primarily as a hedge against a hypothetical break in SHA-2; in practice, SHA-256 remains the default in new protocols. BLAKE2 and BLAKE3 are faster modern alternatives with strong cryptographic properties — useful when you specifically need speed (content-addressed storage, integrity checks at line rate) and your protocol is not constrained to FIPS-approved primitives.
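BLAKE2 ships in Python's standard library; BLAKE3 needs a third-party package. A quick sketch of BLAKE2b sized to match SHA-256's output:

```python
import hashlib

# digest_size=32 gives a 256-bit digest, the same length as SHA-256.
# SHA3-256 is also available in the stdlib as hashlib.sha3_256.
digest = hashlib.blake2b(b"content-addressed block", digest_size=32).hexdigest()
print(digest)
```

BLAKE2b also supports keyed hashing (a `key=` parameter), which can replace HMAC in protocols that are not constrained to FIPS-approved constructions.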

5. Why password hashing is a different problem

The properties that make SHA-256 a great general-purpose hash — fast, deterministic, well-distributed — are exactly the properties that make it disastrous for storing passwords. If your user database leaks and the passwords are SHA-256, an attacker with commodity GPUs can try billions of candidate passwords per second against the leaked hashes, and standard breach lists will find nearly every weak password in the database within minutes.

A password hashing function fixes this by being deliberately slow and, ideally, memory-hard. It also requires a per-user salt — random data mixed into the input — so that two users with the same password produce different hashes. Salts defeat precomputed rainbow tables and force the attacker to attack each user independently. Modern password hashing libraries handle salt generation for you and store the salt as part of the encoded hash output.
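The salt-and-iterate mechanics can be shown with the standard library's PBKDF2, used here purely for illustration because it needs no third-party package; for new systems the article's recommendation is Argon2id, whose libraries handle all of this for you:

```python
import hashlib
import os

def hash_password(password, salt=None):
    # PBKDF2-HMAC-SHA256 from the stdlib, shown only for the mechanics:
    # a fresh random salt per user plus a deliberately high iteration count.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

salt1, hash1 = hash_password("correct horse")
salt2, hash2 = hash_password("correct horse")
print(hash1 == hash2)                                      # False: different salts
print(hash_password("correct horse", salt1)[1] == hash1)   # True: verification
```

Verification simply re-runs the function with the salt stored alongside the hash, which is why real libraries encode the salt and parameters into one output string.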

6. bcrypt, scrypt, and Argon2

bcrypt (Provos & Mazières, 1999) was the first widely deployed password-specific hash. Its cost parameter, the work factor, is an exponent: each increment doubles the number of internal iterations and therefore the time taken. It has been battle-tested for over two decades and has libraries in every language. Its limitation is that it is not strongly memory-hard, so an attacker with custom ASICs can compute guesses far more cheaply per hash than a legitimate server can verify them.

scrypt (Percival, 2009; RFC 7914) introduced memory hardness — the function requires a large amount of RAM, which raises the cost for ASIC and GPU attackers. Used in some cryptocurrencies and by AWS Cognito for legacy migrations.
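Python exposes scrypt directly in the standard library (it requires OpenSSL 1.1+ under the hood). A minimal sketch with moderate example parameters:

```python
import hashlib
import os

salt = os.urandom(16)
# RFC 7914 parameters: n = CPU/memory cost (a power of two), r = block size,
# p = parallelism. n=2**14 with r=8 touches roughly 16 MiB of RAM per hash.
key = hashlib.scrypt(b"hunter2", salt=salt, n=2**14, r=8, p=1, dklen=32)
print(key.hex())
```

Raising `n` is what makes GPU and ASIC attacks expensive: the memory requirement scales with it, and memory is the resource custom hardware cannot cheaply parallelise away.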

Argon2 (Biryukov et al.) won the 2015 Password Hashing Competition and is specified in RFC 9106 (2021). The recommended variant is Argon2id, a hybrid resistant to both side-channel and GPU attacks. Tunable parameters are memory_cost (KiB), time_cost (iterations), and parallelism. The OWASP-recommended baseline in 2026 is roughly 19 MiB, 2 iterations, parallelism 1 — calibrated upward so each verification takes about 250 ms on your hardware.
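With the third-party `argon2-cffi` package (`pip install argon2-cffi`; assumed installed here), the OWASP baseline parameters look like this:

```python
from argon2 import PasswordHasher  # third-party: argon2-cffi

# OWASP baseline: ~19 MiB memory, 2 iterations, 1 lane. Calibrate upward
# until verification takes ~250 ms on your production hardware.
ph = PasswordHasher(memory_cost=19456, time_cost=2, parallelism=1)
encoded = ph.hash("correct horse battery staple")
print(encoded)  # $argon2id$v=19$m=19456,t=2,p=1$<salt>$<hash>

print(ph.verify(encoded, "correct horse battery staple"))  # True; raises on mismatch
```

As with bcrypt, the parameters and salt are encoded into the output string, so stored hashes remain verifiable after you raise the settings.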

7. Side-by-side and decision tree

Algorithm        Output    Status (2026)    Use for
MD5              128-bit   Broken (2004)    Non-security checksums only
SHA-1            160-bit   Broken (2017)    Legacy compatibility only
SHA-256          256-bit   Current          General cryptographic hash
SHA-3 / BLAKE3   256-bit   Current          Hedge / high-throughput hashing
bcrypt           Encoded   Acceptable       Password storage (legacy-safe)
Argon2id         Encoded   Recommended      Password storage (new systems)
  • File integrity check, internal cache key → SHA-256, or BLAKE3 if speed matters.
  • Password storage, new system → Argon2id with OWASP-current parameters.
  • Password storage, existing bcrypt system → Stay on bcrypt at cost 12+, migrate only on next password change.
  • HMAC for API request signing → HMAC-SHA-256 (RFC 2104).
  • Anything labelled MD5 or SHA-1 in security-sensitive code → Migrate.
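The HMAC branch of the tree is all standard library in Python. A minimal request-signing sketch (the secret and payloads are hypothetical):

```python
import hashlib
import hmac

SECRET = b"shared-api-secret"  # hypothetical shared key, loaded from config in practice

def sign(body: bytes) -> str:
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # compare_digest runs in constant time, so an attacker cannot recover
    # the correct signature byte-by-byte from response timing.
    return hmac.compare_digest(sign(body), signature)

sig = sign(b'{"amount": 10}')
print(verify(b'{"amount": 10}', sig))   # True
print(verify(b'{"amount": 99}', sig))   # False
```

Note that HMAC's security does not depend on collision resistance of the underlying hash, which is why HMAC-SHA-256 is safe even in contexts where raw hashing would need extra scrutiny.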

Generate a hash in your browser

DevToolNow's Hash Generator computes MD5, SHA-1, SHA-256, SHA-384, and SHA-512 entirely in the browser using the Web Crypto API. Useful for verifying downloads or generating non-security fingerprints.

Open Hash Generator →

Frequently asked questions

Q. Is it ever OK to use MD5 in 2026?

A. Yes, for non-security purposes. MD5 is fine as a content checksum (verifying a file downloaded intact), as a cache key, or as a fingerprint inside a system you fully control. It is unequivocally not OK for digital signatures, certificate fingerprints, password storage, or anywhere an attacker could benefit from finding a collision. Wang and Yu published a practical collision attack in 2004 and chosen-prefix attacks have been routine since 2008.

Q. Why is SHA-1 still in git?

A. Git uses SHA-1 as a content-addressable identifier, not as a security primitive against an adaptive attacker. The 2017 SHAttered attack (Stevens et al., Google + CWI Amsterdam) demonstrated a practical identical-prefix collision but required substantial compute. Git has since adopted SHA-1DC (collision-detecting SHA-1) as its default implementation and is migrating to SHA-256, available as an experimental object format since git 2.29 (2020). For new code, do not pick SHA-1.

Q. Should I use SHA-256 to hash passwords?

A. No. SHA-256 is fast — that is exactly what you do not want for password hashing. A modern GPU can compute billions of SHA-256 hashes per second, which means an offline attack against a leaked database can try every password in a 10-billion-entry breach list in seconds. Use Argon2id (RFC 9106) for new systems, bcrypt if you need maximum library availability, or scrypt if you have a specific reason. The OWASP Password Storage Cheat Sheet is the authoritative starting point.

Q. What does the work factor in bcrypt actually do?

A. bcrypt's cost parameter is a logarithmic exponent on the number of internal iterations. Cost 12 means 2^12 = 4,096 rounds of the underlying Blowfish-based key schedule. Each step up roughly doubles the time the function takes — both for legitimate verification and for an attacker. The OWASP recommendation in 2026 is cost 12 or higher, calibrated so a single verification takes roughly 250 ms on your production hardware.

Q. Is Argon2id really better than bcrypt?

A. Argon2id won the Password Hashing Competition in 2015 and is specified in RFC 9106. Its main advantage over bcrypt is configurable memory hardness: an attacker using GPUs or ASICs cannot trade memory for speed as easily. bcrypt was designed before GPUs were widespread and was never tuned against that hardware. For new systems, Argon2id is the recommended default. For existing systems on bcrypt, migration is rarely worth the operational risk if your work factor is current.

References

  • NIST FIPS 180-4 — Secure Hash Standard (SHS)
  • NIST SP 800-107 Rev. 1 — Recommendation for Applications Using Approved Hash Algorithms
  • IETF RFC 9106 — Argon2 Memory-Hard Function for Password Hashing and Proof-of-Work Applications
  • OWASP — Password Storage Cheat Sheet
  • Stevens, Bursztein, Karpman, Albertini, Markov — The first collision for full SHA-1 (SHAttered, 2017)

Note: Cryptographic guidance shifts as attacks improve. Cross-check OWASP and NIST publications against the publication date of this guide before adopting algorithms or parameters in production.

About the DevToolNow Editorial Team

DevToolNow's editorial team is made up of working software developers who use these tools every day. Every guide is reviewed against primary sources — IETF RFCs, W3C/WHATWG specifications, MDN Web Docs, and project repositories on GitHub — before publication. We update articles when standards change so the guidance stays current.

Sources we cite: IETF RFCs · MDN Web Docs · WHATWG · ECMAScript spec · Official project READMEs on GitHub
