Comment on Ken Shirriff's blog
On a somewhat related note: if I recall correctly, the AAM (ASCII Adjust AX After Multiply) instruction has a weird behavior. Although Intel only documented it as dividing by 10, the divisor is actually encoded in the instruction's second byte (AAM assembles to D4 0A, where 0A is the 10), so hand-encoding a different second byte lets you divide by an arbitrary value. I don't know if there's anything interesting in the microcode involving this feature.
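(For illustration, a minimal sketch in NASM syntax, assuming 16-bit code; AAM puts the quotient of AL divided by the immediate byte into AH and the remainder into AL, and the db line is just a hand-encoded AAM with a non-standard second byte:)

    mov al, 77        ; dividend in AL
    aam               ; documented form, encodes as D4 0A: AH = 77 / 10 = 7, AL = 77 mod 10 = 7

    mov al, 77
    db 0xD4, 0x08     ; hand-encoded AAM with second byte 8 (undocumented divisor)
                      ; result: AH = 77 / 8 = 9, AL = 77 mod 8 = 5

(Note that a second byte of 0 raises a divide-error exception, just like DIV by zero.)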
Jan 31, 2023, 2:02:15 PM
Posted to Understanding the x86's Decimal Adjust after Addition (DAA) instruction

