Zilog Ends 48-Year Run of Classic Microprocessor

Benj Edwards at Ars Technica:

> Last week, chip manufacturer Zilog announced that after 48 years on the market, its line of standalone DIP (dual inline package) Z80 CPUs is coming to an end, ceasing sales on June 14, 2024. The 8-bit Z80 architecture debuted in 1976 and powered a small-business-PC revolution in conjunction with CP/M, also serving as the heart of the Nintendo Game Boy, Sinclair ZX Spectrum, the Radio Shack TRS-80, the Pac-Man arcade game, and the TI-83 graphing calculator in various forms.
>
> In a letter to customers dated April 15, 2024, Zilog wrote, “Please be advised that our Wafer Foundry Manufacturer will be discontinuing support for the Z80 product and other product lines. Refer to the attached list of the Z84C00 Z80 products affected.”
>
> Designers typically use the Z84C00 chips because of familiarity with the Z80 architecture or to allow legacy system upgrades without needing significant system redesigns. And while many other embedded chip architectures have superseded these Z80 chips in speed, processing power, and capability, they remained go-to solutions for decades in products that didn’t need any extra horsepower.

Being in gaming and general tech circles these days, you’d think chips were only ever made for a short period before the industry moved on. That’s simply not true, and it’s easy to forget that chips like these still get made. I graduated college in 2018 and took a literal ton of math classes, as one does, while getting my Comp Sci degree. Let me tell you, that TI-83 calculator is still very much alive and well. I also still drop a quarter or two into a Pac-Man game anytime I see one.

It’s always interesting to me when one of these articles about older chips comes up. I never knew what was in the things I used growing up, and without articles like this I’d probably never know. I just knew it worked.

> The 8-bit Z80 microprocessor was designed in 1974 by Federico Faggin as a binary-compatible, improved version of the Intel 8080 with a higher clock speed, a built-in DRAM refresh controller, and an extended instruction set. It was extensively used in desktop computers of the late 1970s and early 1980s, arcade video game machines, and embedded systems, and it became a cornerstone of several gaming consoles, like the Sega Master System.

It’s crazy to think that a chip designed in 1974 (!), one that’s older than I am, powered so much for so long.
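
That binary compatibility is worth dwelling on for a second, because the trick behind it is elegant: the Z80 keeps the 8080’s opcode map intact and tucks its new instructions behind a handful of prefix bytes the 8080 never used as prefixes. Here’s a minimal, hypothetical C sketch of that decoding split; the toy decoder and sample program are mine for illustration, not Zilog’s actual logic:

```c
/* Minimal sketch of Z80-vs-8080 opcode decoding. The base opcode space
 * matches the 8080, so existing 8080 binaries run unmodified; the Z80's
 * extended instructions hide behind the prefix bytes 0xCB, 0xED, 0xDD,
 * and 0xFD. This toy decoder ignores operand bytes for simplicity. */
#include <stdint.h>
#include <stdio.h>

static void decode(const uint8_t *code, size_t len) {
    for (size_t pc = 0; pc < len; pc++) {
        uint8_t op = code[pc];
        switch (op) {
        case 0xCB: /* Z80-only: bit test/set, shifts, rotates */
        case 0xED: /* Z80-only: block moves (LDIR), 16-bit loads */
        case 0xDD: /* Z80-only: IX index register forms */
        case 0xFD: /* Z80-only: IY index register forms */
            printf("Z80 extension: prefix 0x%02X, opcode 0x%02X\n",
                   op, code[++pc]);
            break;
        default:
            /* Everything else decodes the same way an 8080 would. */
            printf("8080-compatible opcode: 0x%02X\n", op);
            break;
        }
    }
}

int main(void) {
    /* 0x47 = MOV B,A on the 8080 (LD B,A on the Z80);
       0xED 0xB0 = LDIR, the Z80-only block copy. */
    const uint8_t program[] = { 0x47, 0xED, 0xB0 };
    decode(program, sizeof program);
    return 0;
}
```

(Real DD/FD-prefixed instructions also carry displacement bytes, and the Z80 added things like a second register bank and the built-in DRAM refresh counter on top of this, but the prefix trick is the heart of how it stayed 8080-compatible while growing the instruction set.)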

> ▋

The Game Boy Is A Crazy Feat of Engineering That Almost Didn’t See The Light Of Day

Man, this really took me back. I have very vivid memories of playing Super Mario Land at my grandmother’s house when I was a kid. I probably played more Tetris on my Game Boy than anything else.

It’s always crazy to me how these old-school developers were able to not just find a way, but make the hardware do things it wasn’t supposed to do while squeezing every last bit of performance out of it. It’s also wild how close to death some of these projects came before the one crazy idea that gave them a way forward finally surfaced.

I love these hard engineering stories, especially when it comes to games. Games, and by extension game developers, typically push hardware far beyond its limits. They refuse to accept the limitations of the time, often pushing boundaries and moving everyone forward with them.

If you’re interested, I highly recommend Fabian Sanglard’s The Book of CP-System and his Game Engine Black Books on Wolfenstein 3D and Doom. Absolutely worth the read if you’re curious about how these classic games got made.

I also recommend the Ars Technica War Stories playlist on YouTube.

> ▋

These Intel KS CPUs Are Stupid And Are Making Intel Look Incompetent

Tim essentially nails it in the opening minutes of the video. These chips aren’t interesting or impressive, and I’m not sure why Intel keeps embarrassing themselves for a cheap “win” against AMD with these.

If anything, it proves that just because you can do something with brute force doesn’t mean you should. I know I’ve been ragging on Intel for quite a while now, but they deserve it, at least on the CPU side of the house. They’re not innovating, and they’re barely even competing at this point. The wattage-to-performance on Raptor Lake / Raptor Lake Refresh was already not good, and someone at Intel thought it would be a good idea to go and release this? I know these are supposed to be for “enthusiasts” and maybe overclockers, but there is no justification for this SKU to see the light of day. Any enthusiast who wanted a top-end Intel chip already bought one months ago. Even if I were in this market, I’m not sure what this offers me that a regular old 14900K does not, other than bragging rights that I’m not even sure are good ones.

The power draw is insane (150W base! 500W max!). The price is insane ($690 MSRP!). The performance is… not insane (~2-5% over a 14900K!), at least compared to other CPUs.
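
To put a rough shape on that trade-off, here’s a back-of-the-envelope perf-per-watt comparison. The 253W figure is the 14900K’s rated max turbo power; the 5% uplift and the 500W peak are the most generous readings of the numbers above, so treat this as an illustrative sketch, not benchmark data:

```c
/* Back-of-the-envelope perf/W comparison. These are illustrative
 * assumptions (253W rated max turbo power for the 14900K, the quoted
 * 500W peak and a generous 5% uplift for the KS), not measured
 * benchmark results. */
#include <stdio.h>

int main(void) {
    double k_perf  = 1.00, k_watts  = 253.0; /* 14900K at rated max turbo */
    double ks_perf = 1.05, ks_watts = 500.0; /* 14900KS at its quoted peak */

    double k_ppw  = k_perf / k_watts;
    double ks_ppw = ks_perf / ks_watts;

    printf("14900K : %.5f perf/W\n", k_ppw);
    printf("14900KS: %.5f perf/W (%+.0f%%)\n",
           ks_ppw, 100.0 * (ks_ppw / k_ppw - 1.0));
    /* Even granting the full 5%, perf/W roughly halves at the 500W extreme. */
    return 0;
}
```

The KS obviously doesn’t sit at 500W all day, but even at more realistic sustained draws the direction doesn’t change: you pay a lot of watts for very little performance.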

They had better hope Arrow Lake delivers on its performance promises and helps get that power consumption under control.

> ▋

Tell Me You Copied AMD Adrenalin Without Telling Me You Copied AMD Adrenalin

Kevin Purdy at Ars Technica:

> But perhaps most importantly, Nvidia’s new app allows you to update the driver for your graphics card, the one you paid for, without having to log in to an Nvidia account. I tested it, it worked, and I don’t know why I was surprised, but I’ve been conditioned that way. Given that driver updates are something people often do with new systems and the prior tendencies of Nvidia’s apps to log you out, this is a boon that will pay small but notable cumulative dividends for some time to come.

When I built my new system last spring, this was one of the things I kept hearing about AMD’s drivers: there’s no login. Once I heard that, I immediately questioned why logging in was something I had to do with Nvidia’s drivers at all. There’s absolutely no valid reason for it other than data harvesting. I’m glad to see it go.

This new driver app really does look like it does everything Adrenalin does, and apparently has done for years. It’s only in beta, but the final version should be way better than the old GeForce Experience app ever was.

If you have an Nvidia card and are daring enough to use beta software, download it and give it a try.

> ▋

Apple Catches Everyone Off Guard By Actually Using The Pro Features Of The iPhone. Confusion Ensues

Jess Weatherbed writing at The Verge:

> It’s a neat way to promote the recording quality of iPhone cameras, but it’s not like everyday folks can recreate these kinds of results at home unless they happen to own a shedload of ludicrously expensive equipment. The gear shown in the “Scary Fast” behind-the-scenes footage is fairly standard for big studio productions, but Apple’s implication with these so-called “shot on iPhone” promotions is that anyone can do it if only they buy the newest iPhone.

Jess can be as cynical as she wants, but the simple truth is that nobody would’ve known the keynote was shot on an iPhone if Apple hadn’t told us so. It’s a testament to the iPhone that it’s so good, and puts such powerful equipment in anyone’s hands, that they can produce real, professional footage if they know how to properly use it, just like any other professional-level tool. Will the average user get the results Apple did with the keynote? Probably not, but even a pro wouldn’t get those results using it the way the average user would. The average person isn’t a professional, and that was a pro-level shoot. BUT they have the ability to if they choose to learn to use it that way, and that is absolutely incredible.

I dabble in the streaming space, where a lot of people think they need to spend insane amounts of money to make their stream or videos look good. They don’t. They do need a few things, and the one thing any professional will tell you is more important than any piece of gear you can buy is lighting. You would be amazed at how much a relatively cheap key light, or even a homemade one, helps, yet it’s the last thing people think of.

For Apple, the “pro” moniker really means professional level. It’s not just a spec bump and a lazy upsell. Good on Apple for putting their money where their mouth is.

> ▋

Call of Duty Is Really Betting On The Nostalgia This Year

This ad tries, and mostly succeeds, to capture the same feel as the original from back in the day. Everyone still plays COD, and when you enter “The Lobby” you never know who you’re matching up against; given the way matchmaking works these days, it really does feel like a big nightclub. Like the original, it doesn’t take itself seriously, and it even makes fun of the absurd combinations you end up with in-game when the serious operators and weapons mix with the bright, ridiculous bundles in the store.

To me, the original “there’s a soldier in all of us” ad was one of Call of Duty’s best over the years, and it was the moment COD was cemented as not just a gaming phenomenon but a cultural one. Everyone plays COD, and this ad put that on display. I guarantee that if you don’t play COD, you know someone who either does or has in the past.

But man, with Modern Warfare 3 (2023) launching this year with no new original maps (just remakes of classic Modern Warfare 2 (2009) ones), map voting, the teasing of the whole Makarov thing, and now rumors that more “fan favorites” will return over the first two to three content seasons, they must be hoping that nostalgia is strong enough to carry this year. People are going to figure out very quickly, if they haven’t already, that this is little more than a full-priced expansion pack for Modern Warfare 2 (2022), with changes to address complaints about movement and TTK (time to kill) from streamers and creators who don’t want to look bad when they get outplayed.

> ▋

The Scariest Or Most Interesting Thing About The Microsoft FTC Leak Isn’t The AMD And ARM Stuff Or That It Thought About Buying Nintendo. It Thought About Buying Valve.

Tom’s analysis of the Game Pass situation is pretty interesting, and honestly probably the best breakdown of numbers that nobody seems to be talking about. Game Pass is in trouble and Microsoft knows it. It’s one of the two saving graces of the Xbox Series consoles, the other being backwards compatibility all the way back to the Xbox 360, if not the original Xbox. Unfortunately for Microsoft, the numbers are clear: people prefer to own their games (well, as best you can own them digitally anyway).

Looking at the Game Pass numbers, Microsoft knows they have to bump subscriptions, and hard. Tom gets right up to the heart of the issue, but I feel like he doesn’t quite see the bigger picture. How would Microsoft goose the numbers on Game Pass? Nintendo is the shiny distraction here; Valve is the real cash cow. But what does Valve own that could interest Microsoft to the point that they would want to own them? The answer is simple: Steam¹. Here’s the thing: almost every PC gamer I know buys their games almost exclusively from Steam unless, for some reason, a game just isn’t available there.

Tom asks what Phil Spencer is smoking. I know exactly what Phil Spencer is smoking. Yes, buying studios and publishers allows Microsoft to retain big titles for some kind of exclusivity. That’s expected and, unfortunately in the FTC’s eyes, a problem for down the road. That will only get them so far, though. Take the studios out of the equation for a moment and tell me: what happens if Microsoft simply controls all the storefronts? You don’t think they’re going to try to goose those Game Pass numbers? Take away the crazy deals Steam is known for and swap them out for Game Pass. This is one of the scenarios I see: Microsoft forcing themselves even more on PC gamers who already want nothing to do with them. That’s worse than them buying all the publishers or studios. At least when they own the studio or publisher, they either put the game in the Microsoft Store exclusively (where almost no one will buy it) or on Steam (where people will buy it). The options suck, but at least this way there’s a choice.

What I feel Microsoft’s end goal could be here is to replicate the Apple/Google app store experience across all of gaming. That works for phones, tablets, and even game consoles, but it’s a tough sell on an open platform like the PC.

And we aren’t even getting into the Steam Deck. You know, the sorta-kinda handheld PC that runs Windows games on Linux thanks to the Proton compatibility layer, which I’m sure Microsoft would love to get their hands on and kill.

> ▋


  1. For anyone who may not know, Steam is the biggest PC gaming storefront. There are others (EA has Origin and Activision Blizzard has Battle.net), but Steam is by far the oldest and the largest. They’re also known for their incredible deals. Their seasonal and random midweek sales often have such deep discounts that I’m surprised no publisher has publicly complained all these years. Literally no one I know who games on PC unironically buys games from the Microsoft Store in Windows.

John Romero To Receive Lifetime Achievement Award At GDC. Mabel Addis To Receive The Pioneer Award

Game Developers Conference:

> “This year, the Game Developers Choice Awards will recognize two of the most impactful game development talents in history, Mabel Addis and John Romero,” said Stephenie Hawkins, Director of Event Production for Media & Entertainment at Informa Tech. “The Game Developers Choice Awards are proud to honor two artists with wildly divergent career trajectories, who worked decades apart but shared a creative passion and ingenuity that would help define entire game genres for decades.”

> Mabel Addis is recognized as the first female game designer, but that title alone does not convey the breadth of her pioneering work. As the lead designer of 1964’s The Sumerian Game, she helped pave the way for game elements that wouldn’t become mainstream for decades. Among the innovations she helped conceive were game updates, in-game narrative experiences, and early iterations of what would become known as cutscenes—which, in 1964, took the form of photo slideshows accompanied by synchronized audio. The Sumerian Game itself predates modern display technology and instead used a computer printer to express dialogue and prompts, which took the form of an in-game narrator/character who conveyed game information and asked questions of the player.

I didn’t even know who she was until I read this. It’s amazing to think that so much of what we consider fundamental to today’s games traces back to her, long before the technology could fully support it.

Makes Romero’s accomplishments seem somewhat small by comparison, even though he helped define an entire genre.

> John Romero will be honored with the Lifetime Achievement Award for his work developing more than 100 published games, which include such genre-defining classics as Wolfenstein 3D, DOOM and Quake. As an early indie game developer, Romero’s first game development experience started in 1979 on a computer mainframe before moving to the Apple II in 1982, working as a completely self-taught programmer, designer and artist. He is considered to be among the world’s top game designers, with previous works that have won over 100 industry awards, and a range of development experience in the PC, console, mobile and mod space. Romero is an advocate and supports diversity in the game industry, particularly among Indigenous and Latine youth. He has co-founded 10 game companies, including id Software and Romero Games, which celebrated its seventh anniversary in 2022.

> ▋

id Software Turns 32

These guys made me the gamer I am today. These guys are also why I’m a software engineer.

When I first got my hands on Wolfenstein 3D at a friend’s house back in high school, I was intrigued. It seemed like a different kind of game and I wanted to give it a go. But strangely, it didn’t hook me right away. It wasn’t until Doom that I was absolutely hooked. Back then, PCs were not cheap. Dedicated graphics cards didn’t exist. Windows wasn’t even really a thing yet; it was around, but in its 3.0/3.1 phase, when you still booted to DOS first.

Computers, as gaming devices, were still kind of primitive. These guys cracked the code, opened up a whole new genre, and pushed what was possible at the time to its limit. We have graphics cards because these guys pushed hardware to its limit.

Whenever someone asks me about tech idols, I usually give two people: Steve Jobs and John Carmack. Steve often gets an eye roll because, well, he’s Steve. Carmack always gets a “Who?”, to which I respond: you’ve at least heard of a game called Doom, right? At that point, any nerd who knows just kind of gets it.

Absolute fucking legends. All of them.

> ▋

A Dumpster Fire Of A Processor

Picking up right where we left off…

So basically you get about an extra 2% for the same power consumption as the regular Core i9-13900K. Sure, the DDR5 memory helps, but right now it’s really, really expensive. $269 expensive, and that’s with a $180 discount as of this writing. And it doesn’t even help that much.

No wonder Intel didn’t sample this to reviewers. I’d be embarrassed by this too.

This is their answer to AMD’s V-Cache? Oh boy…

ThisIsFine.jpg

> ▋