Back to the Tech Future: The History of CES in 4 Iconic Gadgets
By David Katzmaier | CNET
The TV, VCR, game console and iPhone have rich beginnings and enduring legacies at the world's biggest tech show.
David Katzmaier, Senior Editorial Director, Hardware
I'll never forget the biggest TV I've ever seen. Deep inside a convention center in Las Vegas, a PR representative for Samsung calmly ushered me past workers setting up for the evening event. They were preparing for Samsung's First Look, the annual unveiling of the company's most ambitious home entertainment hardware for the coming year. Hundreds of journalists and industry insiders would soon have access, but I was getting a behind-the-scenes preview.
We moved past kiosks in mid-construction devoted to PC monitors, smart TV features and wacky displays built into modernist bookshelves. I brushed by the Sero, a TV that could rotate its screen into portrait mode. Then, behind the half-constructed stage, I saw it: The Wall, 292 inches of micro-LED glory, brighter than any movie screen and so much larger than life.
That was at CES, the world's largest tech event, in 2020. Every year, Samsung is one of the show's most important exhibitors of consumer electronics, and I knew that its huge TV would be the talk of my industry. As it towered over me, I felt like I was part of technology history.
I've been attending CES for most of my adult life. With the exception of two years during the COVID pandemic, I've gone every year since 1999. I fly to Vegas in January, right after the holidays, to hustle for a solid week. There, alongside hundreds of other journalists and my CNET colleagues, I write articles and shoot videos about the coolest gadgets on the planet. Tough gig, I know.
CNET has a long history at CES. Teaming up with the Consumer Technology Association, which hosts the show, we've bestowed the official Best of CES Award on a handful of select products. We're doing it again in 2026, this time in conjunction with our colleagues at PC Magazine, ZDNet, Mashable and other Ziff Davis publications. The massive show is scheduled for the week of Jan. 5, and we've spent months planning how to tackle it.
Huge TVs remain one of the most recognizable symbols of CES, and they've only grown in significance since the introduction of HDTV broadcasts in 1998.
"HDTV was the biggest thing in my lifetime for video, no question about it," says Gary Shapiro, president of the CTA. "HDTV fundamentally changed the viewing experience."
But there's a lot more to CES than TVs. Over the years, the consumer electronics extravaganza has been where we first got a glimpse of technology that we use every day — game consoles, cutting-edge phones, even streaming services — as well as more futuristic tech, including humanoid robots, AI-powered laundry machines, and personal electric aircraft. CES is where thousands of companies debut their splashiest innovations, and it's one of the most important predictors of the next big tech trend.
And even though bellwether companies like Apple, Amazon, Google, Meta, and Samsung hype their own events and livestreams throughout the year to launch major products, CES has endured.
Other major trade shows have come and gone. Comdex, which ran from 1979 to 2003 and was also based in Vegas, showcased the computer technology of the day, competing directly against CES. E3, a massive video game industry event spawned from CES, took place annually from 1995 to 2021. A handful of international technology trade shows, including Mobile World Congress in Barcelona, IFA in Berlin and Computex in Taipei, are still going strong, but CES remains king.
We can see the influence of the show on TVs, VCRs, game consoles and PDAs. These four devices, each with a rich history at CES, have a tech legacy that continues to push forward. I expect to see a continuation of that evolution at CES 2026, along with other devices, services and technologies still in their formative stages.
The next wave of household robotics, autonomous mobility, AI-assisted health care and salt spoons will exist in a booth or display at this year's show. It might be years before they're affordable, accessible and useful enough to become part of our lives. The road from wild concept to household mainstay is long and fraught with many dead ends, but it often begins at CES.
CES 1967: In the beginning, there were TVs
The very first CES — at the time, it went by its full name, the Consumer Electronics Show — took place in New York City in 1967. It attracted 117 exhibitors, which is tiny by today's standards. By comparison, CES 2025 featured more than 4,500 exhibitors and drew 142,465 attendees, and pre-COVID shows were even larger.
The inaugural CES was a spinoff of another technology-focused trade event, the Chicago Music Show, where audio technology showed up in the form of pocket radios.
From its advent, CES featured TVs. Television was invented long before the 1960s, but that decade saw the adoption of color broadcasts in the US and the launch of TV satellites.
At the 1967 CES, the most cutting-edge TVs displayed were those with integrated circuits, a technology that combines multiple electronic components into a small "chip," or microchip — the stuff that eventually would become the guts of every laptop and smartphone. During that show and for decades afterward, most TVs used a cathode ray tube, or CRT, which made the screens small and the sets heavy. When we were kids, my sister and I sat a couple of feet from our tube TV at home watching cartoons, despite our parents' warnings that sitting too close would ruin our eyesight.
No matter the era, the ideal TV is always something bigger, something that promises to bring the immersive, magical feel of a movie theater into a home. During my career, I've watched TVs expand and improve, year after year, with higher resolution, better contrast, more realistic color and brightness, chasing reality in fidelity and size.
A 2025 CNET survey found people do indeed crave huge screens. Nearly half of the respondents said that if money were no object, they would want a TV bigger than 65 inches in their homes.
"A lot of people ask what size TV should I buy, and I always tell people to buy one size bigger than you think you need," Chris Hamdorf, executive vice president at TV maker TCL, told CNET in 2025. As a TV reviewer, I give people the same advice, but there was a time when 65-inch TVs were far from common.
During the '80s and '90s, larger-screen televisions hit the market using a technology similar to movie projectors. Called rear-projection TVs, they housed the projector inside the TV cabinet and created an image from behind the screen. Early models used CRTs, and later iterations employed an acronym-heavy array of other technologies (think DLP, LCD and LCoS). Numerous rear-projection TVs defined my first few years covering CES, before they were replaced by flat panels, often thin and light enough to hang on a wall, a harbinger of the screens we use today for our Netflix binge-watching.
Plasma technology arrived in 1995 with the world's first large (42-inch) plasma display, from Fujitsu, and at CES 1997, Philips showcased the first commercially available version. The technology evolved in succeeding years but remained expensive — in 2005, Toshiba sold a 42-inch plasma for $4,500, for example. Just a few years later, plasma hit mainstream pricing and became easy to recommend.
In 2010, the Best of CES award went to the first plasma TV with 3D capability, the Panasonic V10, chosen by a group of CNET journalists, including me. We met in CNET's double-wide trailer in the convention center parking lot and debated our way to the best overall winner. I touted the superb picture quality of previous Panasonic plasmas I'd reviewed, along with the promise of the new 3D video capability, glasses and all. Oops! Within a few years, that concept was a walking corpse, and I commented on its death in 2017.
LCD-based displays were evolving at the same time, and that technology soon outsold both plasma and other non-flat technologies. With the advent of 4K resolution, plasma technology became less popular and eventually left the market altogether by 2014. LCD has been the dominant TV technology ever since.
Nowadays, CES is awash in massive screens, although none are quite as large as the 292-inch micro-LED TV that impressed me so much. At CES 2024, I was particularly enamored by the 132-inch, $200,000 folding TV by C-Seed. LG's booth is another impressive example, with its incredible OLED multiscreen displays. But innovation in TVs has certainly slowed down, as larger TVs with excellent image quality have become increasingly affordable over the years.
"To be honest with you, the importance of TV at CES is definitely diminished," CTA's Shapiro says. "Because it is such an amazing consumer product that it's almost cheaper than wallpaper now."
CES 1970: A $13,000 VCR sets the stage for cheap streaming
For as long as TVs have existed, they have seemingly delivered the same basic concept: a screen with moving video and sound that you watch for entertainment at home. Other groundbreaking technology ideas, however, have evolved significantly over a short period of time.
In 1970, just three years into the history of CES, Philips showcased the N1500 VCR, the first device to record TV shows onto cassette tapes. Originally a piece of professional broadcast equipment, it hit the UK market in 1972, where it sold for £600 — the equivalent of $13,000 today. It had a built-in TV tuner to record television programs broadcast over the air, as well as an analog clock timer that could automatically start recordings.
"The VCR was important on so many different levels," Shapiro tells me. "It changed the concept of TV. It empowered consumers to choose what they want to watch and when they want to watch it."
The idea that you could "time shift" to watch a show at a later time was revolutionary, eventually transforming home entertainment forever. Until then, television programming had set broadcast times. To experience "appointment TV," you had to follow a show's schedule at the moment it aired. With the ability to record and archive video independently, people took more ownership of their entertainment.
The format used by the N1500 was actually called "VCR," but it was never successfully marketed in the US, opening the door for two other formats: Betamax (introduced by Sony) and VHS (developed by JVC). Betamax hit the market in 1975, the year I was born, and was seen by many as technically superior to VHS, with better image quality. My father was a Beta guy and took great pride in his collection of Disney movies recorded off-air.
VHS launched at CES in 1977. It used a larger cassette tape than Beta and promised longer recording times (2 hours versus 1 hour). Over the next few years, the two incompatible formats and their devices — both now called VCRs — battled it out in the market, one-upping each other in marketing, brand support and technological innovation, such as recording lengths.
"There was a format war going on between VHS and Beta," Shapiro says. "And it was intense."
Over time, Betamax sales declined as more households adopted VHS. By 1988, 170 million VCRs had been sold worldwide, with only 13% being Betamax models. Sony also announced that it would manufacture VHS VCRs. The format war was effectively over.
The VCR had a 40-year reign, characterized by Blockbuster video rental stores and their reminders to "be kind, rewind." But the technology of home video was about to get a digital makeover.
The DVD format delivered superior image quality in a smaller, more durable disc that didn't require rewinding. It also allowed recording via DVD-R discs. Around the same time, a disc-free device made its debut: the DVR. It proved much more popular than DVD-R for recording TV shows and movies.
TiVo and Replay TV were among the first DVRs, devices that stored hundreds of hours of TV shows and enabled automatic recording to a hard disk. With a DVR, the "work" of programming recordings was much easier. You could simply indicate that you wanted to record every new episode of The Simpsons, and the DVR would do it automatically. There were no discs to bother with, so you didn't have to worry about damaging them.
DVRs also allowed you to fast-forward through commercials and skip ahead in 15- or 30-second increments. Some even included the ability to skip past commercials automatically, without having to press a button at all. One of those DVRs, the Dish Network Hopper, debuted at CES and was named the 2013 Best of CES winner by CNET.
Until it wasn't. The company that owned CNET at the time, CBS, was in the process of suing Dish over its commercial-skipping capabilities. CBS intervened in the awards process and instructed CNET's editorial staff to select an alternative winner instead.
"When I heard that CNET gave an award and CBS reversed it, that must have been devastating to the staff," Shapiro tells me. "Then I realized this is, like, a gift. This is gonna get more publicity than ever." He wrote a column for USA Today headlined "CBS orders crush CNET credibility." The reversal sparked a controversy that CNET veterans like me remember as an example of corporate interests overstepping editorial integrity.
DVRs remain a staple in US households today, typically sold by cable TV companies. But as more Americans ditch their cable subscriptions and replace them with streaming services, DVRs have moved to the cloud. The first live TV streaming service, Sling TV, debuted at CES 2015 with a $20 package that included channels such as ESPN, CNN, TNT and Disney Channel. I said at the time that it stole the show and presaged a new era of cutting the cable TV cord.
Today, Sling and its rivals — YouTube TV, Hulu Plus Live TV and more — all offer cloud DVRs. They let you record TV shows automatically and watch them whenever and wherever, and even let you fast-forward through commercials. But you'll have to press a button to do so.
Beyond TV shows and movies, another kind of entertainment shares a rich history of CES debuts. It got its start on television at home, but has moved rapidly into portable formats and even virtual reality. I'm talking about video games.
CES 1977: The Atari 2600 is whatever happened to Pong
Debuting at CES on June 4, 1977, the iconic Atari 2600 console launched the home video game industry. The announcement of the console was a bit of a surprise. Attendees at the show expected the big news to involve the debut of the VHS format.
The 2600 wasn't the first home console (a distinction that belongs to the Magnavox Odyssey in 1972), but Atari's was the first to go mainstream. Atari was founded in 1972, and its breakthrough game, Pong, is widely considered the earliest successful video game. Originating as a stand-up arcade game, Pong then made its way to home consoles, including the Home Pong, a TV-connected console that was introduced at CES in 1974.
Atari employees included Steve Jobs and Steve Wozniak, who soon went on to found Apple. The two teamed up to develop another seminal game, Breakout. Atari was sold to Warner Communications in 1976 for $28 million to fund the development of a device code-named VCS (Video Computer System), which was eventually renamed the Atari 2600.
Video games were new at the time and playable on dedicated arcade machines. Much like the VCR, a home game console was a TV accessory that opened up another world without requiring you to leave the house. Allowing users to interact with the screen and control the contents, as well as compete against another player, was groundbreaking.
Gaming has since evolved to encompass numerous other systems and platforms, extending beyond the confines of TVs to include computers, phones, VR headsets and more. It's everywhere and more popular than ever, and CES has played right along.
"We were very important to the game industry. We had Nintendo and Sony and Sega," Shapiro says, adding, "I remember the guy from Atari was on our board."
The 2600 was a big hit, outselling its initial production run in 1977 and eventually selling 30 million units worldwide. My uncle and aunt bought it for the family one Christmas, and I vividly recall hours spent with my younger cousins — and the adults — as we sat mesmerized, kicking butts in Defender, Combat and Space Invaders. Safe to say I was hooked on gaming, along with millions of other people.
Atari is still around today and even released the throwback Atari 2600 Plus a couple of years ago, but other companies and devices have dominated gaming since the early 1980s. In 1985, Nintendo unveiled the NES, or Nintendo Entertainment System, at the June CES. I bought the console with my paper route money, and hid it from my father, who didn't approve of video games.
The NES is regarded as one of the most influential game consoles ever. It featured add-ons like the Zapper light gun and launched Nintendo's best-known franchises, including Mario, Metroid and Zelda. Numerous TV-based and handheld successors followed, culminating in the Nintendo Switch, the first hit console to combine at-home and portable gaming in a single device.
"CES always felt a little off-timed for gaming compared to E3's late-spring gaming reveals, but it's still been a place where new gaming tech has pushed the envelope," says CNET Editor at Large Scott Stein, who's been attending CES shows since 2004. "The Razer Edge gaming tablet gave a preview of where the Switch was later heading, all the way back in 2013."
Another modern game console also got its start at CES. In 2001, Microsoft chairman Bill Gates revealed the final design of the computer company's first foray into console gaming — and the first major console produced by an American company since Atari. It was called the Xbox.
In a memorably over-the-top CES keynote address, Gates took the stage with Dwayne "The Rock" Johnson, then a professional wrestler, to hype up the device. Gates pulled aside a black cloth with the words: "For the first time, let me unveil Xbox." The striking black monolith, with neon green highlights, had a big "X" embossed on top and a massive wired controller.
Microsoft aimed to compete with Sony's successful PlayStation console and added similar features to the Xbox, including a broadband connection and the ability to play CD-ROMs and DVD movies. The Xbox was also the first console with a built-in hard disk drive, which presaged the modern versions of gaming consoles. (The current Xbox Series S and the digital edition of the PlayStation 5 lack disc drives altogether and rely on games stored on internal drives.)
The Xbox was also massively successful, in part because of the popularity of one of its launch titles — Halo: Combat Evolved — but it never matched the popularity of its direct competitor, Sony's PlayStation 2. The Xbox versus PlayStation rivalry continues to this day, with Microsoft and Sony trading exclusive titles, buying game studios and matching one another spec for spec.
The original Xbox was the last major console announcement at CES. The show's relevance for gaming was eclipsed by E3, a competing convention in Los Angeles that attracted game developers and other hardware makers.
"That's definitely one of my biggest career failures, when we lost that segment," Shapiro says when I ask about gaming. "We made some bad decisions, and they created E3, which has gone through a life cycle of its own."
Nonetheless, CES has remained an important venue for gaming hardware debuts. PC makers, chipmakers like Nvidia and VR and AR companies, including Oculus — now folded into Meta — still debut products in Vegas. Meanwhile, big gaming companies launching consoles have opted to dribble news and rumors out over months, culminating in dedicated events like the PS5 showcase in 2020 and Nintendo Direct for the Switch 2 earlier this year.
The move toward launching important tech products at separate, company-specific events has certainly diminished the importance of CES over the years. No company illustrates that trend more than Apple.
CES 1992: Apple drops Newton, a failed precursor to the iPhone
Much like its on-again, off-again relationship with gaming companies, CES isn't really a phone show today. That title belongs to Mobile World Congress. Yet even MWC plays second fiddle to the individual phone launches put on by big mobile companies: Samsung Unpacked, the Google Pixel event and, most importantly, the Apple iPhone event. That's where prospective phone buyers and tech journalists gather to get all the details about the year's newest mobile gear.
In 1992, 15 years before Steve Jobs announced the iPhone at Apple's press event, the company attended its first CES. Apple's CEO at the time was John Sculley, and the device he unveiled was called the Newton MessagePad. Sculley hailed it as "nothing less than a revolution," and it marked the computer company's first new product line since the introduction of the Macintosh.
The Newton was incredibly ambitious for its time, and it's not difficult to see a straight evolutionary line from the MessagePad to the iPhone. Apple called it a PDA, for personal digital assistant. The Newton was a portable, handheld device dominated by a large screen, designed to help users take notes and organize contacts, calendars and more. It allowed people to read ebooks more than a decade before Amazon launched the Kindle. A Newton advertisement boasted: "Send faxes without paper and receive pager messages and email."
Ultimately, however, the Newton was a market failure. Its chief feature was handwriting recognition — the device could convert words written on the screen with an included stylus into text. That feature didn't work well, often failing to accurately convert even simple words to text, and was famously skewered by the Doonesbury comic strip. For a glorified notepad, the Newton itself was way too expensive, starting at $700 when it hit the market in 1993, which would be more than $1,500 today.
Other PDAs at the time included the IBM Simon and the Nokia 9000, both of which featured early cellular phone functionality. Devices like the BlackBerry and handhelds running tiny mobile versions of Microsoft Windows also appeared in the late 1990s, but early smartphones — basically, PDAs with cellular technology built in — quickly overtook them. One of the most buzzworthy products of CES 2009 was the Palm Pre, a smartphone using the company's brand-new WebOS mobile software. The Pre won CNET's Best in Show and the People's Voice Award, cementing its place in CES history.
"Palm knew exactly what it was doing using CES to launch a comeback phone with a daring new OS," recalls Jessica Dolcourt, now CNET's vice president of content, who was, at the time, an editor covering mobile technology. "It was a brilliant play that said the Pre wasn't 'just' a phone — it was as consequential and dazzling as any TV or gaming laptop."
The Pre brought something fresh and new to smartphones at a time of tremendous difference and diversity, Dolcourt says. "I could not wait to get my hands on it."
Apple, meanwhile, made sporadic appearances at CES but increasingly seemed to regard the sprawling, splashy event as a direct rival for its attention in the tech world. One of my most vivid CES memories was in 2011, when we learned that the iPhone was coming to Verizon. Apple made the massive announcement in New York during CES, completely upstaging the Vegas convention. In later years, Apple appeared at CES to discuss privacy and introduce AirPlay to TVs, among other initiatives, but none of its CES announcements could compare to the impact of the Newton.
"John Sculley was a keynote speaker," Shapiro says. "Steve Jobs never was. And when I asked him about it, he said, 'Love to keynote. Just move it to San Francisco and call it Macworld.'"
CES 2026: What's next for tech history?
If there's any lesson I've learned from CES after all these years covering the show, it's that flashy tech ideas can take longer than you might expect to become part of our everyday lives, if they do at all. In each of the cases above, the device that debuted first didn't immediately revolutionize the market, or do so all by itself. It took years of intense competition to sort out a "winner."
The 2026 edition of CES is about to kick off in Vegas for the show's 59th year. As tech giants hold their own events, and innovations shift increasingly from the world of physical hardware (phones, laptops and TVs) to digital software (apps, social media and AI), the decades-old question arises: Does CES even matter anymore?
If you ask the 150,000 people expected to attend this year, the answer is beside the point. CES is here, steeped in history, and it's sure to be packed with futuristic, ambitious and weird new technology. It's almost certainly going to be around next year, too.
So I say pass the impossible lobster and point me to the flying robot AI cars.