Reverse engineering a forgotten 1970s Intel dual core beast: 8271, a new ISA

scarybeastsecurity.blogspot.com

169 points by scarybeast 5 years ago · 39 comments

ajross 5 years ago

I love these die shot RE walkthroughs. This is such a weird choice though; almost no one used this chip for production hardware. The WD 17xx series was the king of floppy controllers in the 70's and early 80's (well, if you ignore Woz's masterpiece, which was discrete logic).

Seems like if you're going to reverse an obscure chip, something with a more exotic application would have been more fun?

  • scarybeastOP 5 years ago

    We definitely wanted fun! I guess there are two ways to slice and dice it: 1) fun because the application is exotic / iconic, or 2) fun because the chip is exotic. This is definitely a case of the latter: an exotic chip with interesting history.

    • bane 5 years ago

      The next goal should be trying to write a demo for it and enter it into Demosplash 2021.

  • rbanffy 5 years ago

    > Woz's masterpiece, which was discrete logic

    It had the downside of requiring precise timing. This is an issue if you wanted a faster (or even a slightly improved) 6502 in the machine.

    Oh well...

    • ajross 5 years ago

      None of the Apple II's competitors exploited that, though. EVERYTHING in that world of software was built around tight hardware assumptions. Atari, VIC, Commodore, none of them ever shipped a faster CPU. That needed to wait for later architectures.

      No, the Disk II was absolutely a masterpiece. Go back to the linked article, think about everything this very complicated chip did, and recognize that Woz made a card that did exactly the same thing out of 6 chips you could buy at Radio Shack and two tiny ROMs.

      • rbanffy 5 years ago

        I am familiar with the design. At the time, the only computers that were less timing dependent were the S-100 and CP/M boxes where you could swap out a CPU for a faster one if memory speed permitted.

        I don't consider it a mistake - nobody at the time realized that we would all, eventually, move our software to faster computers every couple of years. Woz is brilliant, but he couldn't have foreseen it and, as you point out, almost nobody did.

    • pvg 5 years ago

      This made me realize that Zip chips were a lot more complicated than kid-me thought:

      https://en.wikipedia.org/wiki/Apple_II_accelerators#Zip_Chip...

      • rbanffy 5 years ago

        Those things were amazing engineering. Someone's probably making something like it for the retrocomputing crowd.

        With current tech it should be trivial to keep all of the RAM in the module and just asynchronously push some writes out (video and IO) at Apple II speeds.

    • nradov 5 years ago

      Back then no one really expected microcomputer hardware architectures to have a long lifespan or allow for incremental CPU upgrades every year.

lathiat 5 years ago

The floppy drive controller chip costing more than, and being better than, the main CPU reminds me of some of my Arduino projects.

For example, I was driving a GSM modem (over serial), which itself contained a comparatively advanced ARM SoC, from an ATmega 8-bit micro that I was programming. It was great educationally, but kind of hilarious :)

userbinator 5 years ago

A lot of special-purpose ICs are actually general-purpose processors with a mask ROM (or sometimes EEPROM, with interesting consequences), since writing the "firmware" for different functionality is easier than doing a whole "hard-coded" chip design --- the various USB-to-X adapters are one common example of this.

  • segfaultbuserr 5 years ago

    A few days ago, I bought a USB-to-serial chip made by an obscure manufacturer. I found the RX/TX ports were working but the control signals were not. I poked around the chip with an oscilloscope and found those pins were high-impedance; the chip didn't even attempt to output anything. I contacted tech support. They told me to buy a new one because my chip was from an old batch without this feature (I guess it's a bug), but that they could also provide a firmware update to me under NDA.

    It's microcontroller all the way down.

    Fun fact: it's also how those fake FTDI FT232 chips on the gray market were made. Counterfeiters just picked a cheap general-purpose microcontroller already in mass production and wrote a mask ROM for it. What's funny is that the counterfeit chips are actually made on a better process node than the real one (which doesn't mean they're better, though).

    https://zeptobars.com/en/read/FTDI-FT232RL-real-vs-fake-supe...

    • marcan_42 5 years ago

      They are better though: the clones fixed bugs that the original FTDIs had (bugs bad enough to make certain modes completely useless).

      https://twitter.com/marcan42/status/695292366639378433?s=19

      Don't buy FTDI chips; their malicious driver incident that bricked clones (by exploiting another bug in the EEPROM write support in their own chips!) should be enough to convince every board designer to stay away from them.

      • segfaultbuserr 5 years ago

        The clone chips are not better: they may have fixed some bugs in the original chip, but they can have their own bugs. While there may be some good clones, there's no way to confirm the chip you got is good. Better to avoid FTDI, real or fake, altogether. For the record, I haven't used or purchased any FTDI chips since that driver incident; the chip I mentioned was not an FTDI chip.

      • alyandon 5 years ago

        I needed a USB-to-serial converter and specifically went out of my way to ensure I did not purchase an FTDI-based one, precisely because of that malicious driver incident.

        • segfaultbuserr 5 years ago

          I have used products powered by the Microchip MCP2200, SiLabs CP2102, Prolific PL2303, or WCH CH341; it's easy to stay away from FTDI ;-)

    • linker3000 5 years ago

      It's the same situation with USB Bluetooth dongles - many purport to be based on a Cambridge Silicon Radio (CSR - now owned by Qualcomm) chip, but are in fact using one of several cheap alternatives.

      Many of these clones have their own quirks, and the amount of workaround code that's been added to the Linux drivers is quite notable. Kernel 5.8 onwards seems to incorporate most of the fixes, but along the way, patches like this were available:

      https://gist.github.com/nevack/6b36b82d715dc025163d9e9124840...

      Despite the fantastic efforts of the driver maintainers, I still managed to find a cheap "CSR 4" dongle that threw all sorts of errors in the logs and didn't work. I'm waiting for some Broadcom chip-based replacements to arrive.

  • 737maxtw 5 years ago

    A large number of Realtek products are built around some form of MIPS core (or at least used to be).

  • 2sk21 5 years ago

    And there is a security aspect to this design decision too - if a way can be found to inject arbitrary code into one of these microcontrollers, all kinds of access will be opened.

__d 5 years ago

The decision to use this chip (vs the WD) in the BBC Micro made me wonder if there were other obvious-in-hindsight bad/weird/hackish choices in classic computer design?

The PC-AT's use of the keyboard controller to control A20 and CPU reset comes to mind.

Any others?

  • Jeema101 5 years ago

    Several video game consoles from back in the day (the Magnavox Odyssey 2 and Entex Adventurevision) used the MCS-48 microcontroller mentioned in this write up instead of more conventional CPUs of the time like the 6502 or Z80, which seems a bit strange.

    Having dabbled in both MCS-48 and 6502 assembly I can tell you it was probably not done to make software development easier. :) I suspect maybe the hardware was just easier to design around a microcontroller? ...but there could have been other reasons...

    • Marazan 5 years ago

      Both the 6502 and Z80 were designed/imagined for use in embedded systems, not the general purpose computers they became famous in.

rbanffy 5 years ago

My guess is that, at some point, a manager at Acorn asked which disk controller the engineers would recommend and they, obviously joking, said "8271".

When they realized what they had done, it was too late.

  • fredoralive 5 years ago

    If this is in the context of the BBC Micro, Acorn had already used the 8271 for floppy support with the previous “System” and “Atom” models[1], so it was probably chosen through inertia from a previous design choice.

    As for the older systems, does anyone have a date for the WD 1771 (the more common alternative)? The data sheet shown on the Wikipedia page is dated April 1979, and if that’s the release date then it would have been a new, unproven part compared to the Intel chip when the System and Atom machines were released in 1979/80, assuming the floppy drive was an early upgrade (again, I don’t have dates to hand).

    [1] http://chrisacorns.computinghistory.org.uk/8bit_Upgrades/Aco...

    • kens 5 years ago

      Google Books shows the WD1771 available and in use in 1977 with more products using it in 1978.

      (I find Google Books very useful for this sort of archaeology. Change the time range to "1977-1980", search for "wd1771", and then sort by date. Of course this isn't thorough, but it gives a good overview for little effort.)

      • fredoralive 5 years ago

        Thank you for the research. In that case I guess it’s more of a wrong bet on part choice in 1979.

      • codetrotter 5 years ago

        Off topic, but I wish Google allowed doing the same for web results: searching their index exactly as it was some number of years ago, so that it would only show results that had actually been indexed at that point in time.

    • jandrese 5 years ago

      Very believable that there was institutional inertia for using the chip that they already understood well instead of a brand new chip just appearing on the market. The new chip might be better but you could lose a month or three of development time integrating it, assuming you don't run across an unexpected bug.

      But at the same time it's no surprise that later models switched over to the cheaper and more technically capable chip.

intricatedetail 5 years ago

Is there tech currently available to decap and photograph chips at home? For example, for reverse engineering old PAL chips?

  • kens 5 years ago

    These chips come in a ceramic package, so decapping is a simple matter of tapping along the seam with a chisel. If you want high-quality die photos, you'll need a metallurgical microscope, which shines the light down through the lens. I got my microscope on eBay. Once you have a bunch of die photos, you can stitch them together with a program such as Hugin.
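
    For what it's worth, here's a rough sketch of that stitching step, driving Hugin's command-line tools (pto_gen, cpfind, cpclean, autooptimiser, pano_modify, nona, enblend) from Python. The file names are made up, and the exact flags (particularly projection and optimisation settings for a flat die mosaic) will need tuning for your Hugin version:

        # Minimal sketch: stitch overlapping die photos into one image using
        # Hugin's CLI tools. Assumes Hugin is installed and the tiles overlap.
        import glob
        import subprocess

        def run(*cmd):
            print(" ".join(cmd))
            subprocess.run(cmd, check=True)

        tiles = sorted(glob.glob("die_*.jpg"))                     # hypothetical input tiles
        run("pto_gen", "-o", "die.pto", *tiles)                    # create a Hugin project
        run("cpfind", "--multirow", "-o", "die.pto", "die.pto")    # detect control points
        run("cpclean", "-o", "die.pto", "die.pto")                 # discard bad matches
        run("autooptimiser", "-a", "-m", "-l", "-s", "-o", "die.pto", "die.pto")
        run("pano_modify", "--canvas=AUTO", "--crop=AUTO", "-o", "die.pto", "die.pto")
        run("nona", "-m", "TIFF_m", "-o", "die_tile_", "die.pto")  # remap each tile
        run("enblend", "-o", "die_stitched.tif", *sorted(glob.glob("die_tile_*.tif")))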

  • krallja 5 years ago

    Scanning just uses a microscope and open source panorama stitching software: http://www.righto.com/2015/12/creating-high-resolution-integ...

    Home decapping is possible, from what I can tell, but uses methods like “fuming nitric acid” and “blowtorch”, neither of which I would really want to experiment with right now.

    resilicon.reddit.com is becoming fairly active, if you have an interest.

dboreham 5 years ago

The D765 was a D765B, probably a shrunk version, hence the different die size.

mrlonglong 5 years ago

Got one of these in my Beeb, never knew it was a dual core monster. No bloody wonder they were so expensive back in the day.

jlarcombe 5 years ago

Incredible stuff!
