Loading

By Editor Morten B. Reitoft 

When I was 14, I got my first computer - a Commodore 64. It was the start of a new era, not only for me but also for the generations to follow. Some of the people I have interviewed, today active in the graphic arts industry, started back in the early '80s with these first-generation home computers. The Sinclair ZX81, the ZX Spectrum, the Acorn BBC Micro, the Commodore VIC-20, the Commodore 64, the Amstrad machines, and several other 8-bit computers drove the mass adoption of computing. Before these machines, the only microprocessors found in the home were in calculators, simple appliances, and game consoles like the Atari 2600 and the Philips G7000. Microchips and semiconductors started a fantastic journey that turned the computer into an everyday household item.

These computers were so limited in performance that developing games, or even tiny business applications, pushed them to their limits. Yet millions of them were sold.

My generation started to learn to program and to understand the logic behind computers, and we have since built computing into the many applications and devices that today make up the Internet of Things. Refrigerators, coffee brewers, cars, televisions, phones - practically everything we use today has become super dependent on technology, and the revolution has only just started!

Computers have become the most critical technology for solving current and future problems. Ever since the first computers became mainstream, processors have become faster and faster. The home computers, and even the first business computers, used CPUs like the Zilog Z80, the Motorola 6800 and later the 68000, or the MOS Technology 6502 and later the 6510. The Z80 was developed by Zilog, a company founded in 1974, and the chip is still produced today - Texas Instruments, for example, has used it in some of its calculators. The Z80 became one of the most used CPUs of its time, competing with Intel's 8080, Motorola's 6800, and the MOS 65XX series.

The Motorola 68000 became the preferred CPU for Apple and its Macintosh computers. The Z80, the 65XX series, the 8080, and the 6800 were all CISC-based (Complex Instruction Set Computer) 8-bit chips.

The MOS 6510 became extremely popular because it powered the Commodore 64, which became one of the best-selling computers ever. Intel was founded in 1968 by Gordon Moore and Robert Noyce. The early processors look simple from today's perspective. Still, by moving from discrete transistors to integrated semiconductor chips, computing power suddenly became accessible for everything from clocks and calculators to consumer electronics. With mass production and mass volume, the price of chips fell, making them even more accessible. Just a few decades earlier, in the early '40s, IBM's president Thomas J. Watson is said to have remarked that the world would only need about five computers. The paradigm changed entirely, and microchips became a necessity for many companies.

The Intel architecture, known as x86, has evolved ever since but is still found in the vast majority of computers today - we know the chips as Core i3, i5, i7, and so forth, and today billions of transistors fit on a tiny piece of silicon. The production of chips is limited to a few specialized semiconductor companies. And though we all depend on chips, it is strange to think that the advanced lithography machines used to make the most modern chips come from essentially one manufacturer in the world, located in Veldhoven, the Netherlands. The name of that company is ASML.

As processors get faster and faster, they also require more energy, and computer chips use an insane amount of power, which leads to excessive heat and, in turn, makes it harder to scale chips to higher speeds. I mentioned CISC earlier. When Apple, in collaboration with IBM and Motorola, developed the PowerPC, the idea was to create a RISC-based CPU (Reduced Instruction Set Computer). A RISC CPU has fewer instructions and needs more of them to perform the same task, but each instruction executes much faster in return (the small sketch below illustrates the idea). In the mid-'90s, Apple licensed its OS to other manufacturers, and companies like Power Computing and UMAX produced Mac clones. Back in 1985, Steve Jobs had been forced out of Apple. He then established NeXT, which pushed the limits of computing, and he bought the company that became Pixar, the studio behind films like Toy Story. I mention NeXT because its operating system, NeXTSTEP, built on a solid UNIX core, became the foundation of what we today know as macOS.
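To make the RISC versus CISC trade-off a bit more concrete, here is a purely illustrative sketch in Python - not real machine code, and the cycle counts are made-up assumptions: a "CISC-style" multiply finishes in one complex step, while a "RISC-style" multiply reaches the same result through many simple shift-and-add steps, each of which is cheap and fast.

```python
# Toy illustration of the RISC vs. CISC trade-off (cycle counts are invented).

def cisc_multiply(a: int, b: int) -> tuple[int, int]:
    """One complex instruction: the result in a single, but 'slow', step."""
    cycles = 10          # hypothetical cost of one complex multiply instruction
    return a * b, cycles

def risc_multiply(a: int, b: int) -> tuple[int, int]:
    """Many simple instructions: shift-and-add, each step cheap and fast."""
    result, cycles = 0, 0
    while b:
        if b & 1:        # simple bit-test instruction
            result += a  # simple add instruction
        a <<= 1          # simple shift instruction
        b >>= 1
        cycles += 3      # three hypothetical 1-cycle instructions per loop
    return result, cycles

print(cisc_multiply(14, 6))  # (84, 10)
print(risc_multiply(14, 6))  # (84, 9) -> more instructions, but each is cheap
```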

Steve Jobs returned to Apple, and a new era for Apple and the personal computer began. First came the iMac, then the iPod, the iPhone, and later the iPad - products that changed the world and continue to change millions of people's lives as the devices get faster and faster. Apple barely made it and survived only because Bill Gates and Microsoft invested in Apple and guaranteed continued support for the Mac editions of MS Office. Microsoft, meanwhile, had problems of its own in the late '90s, when it was found to have engaged in antitrust behavior.

In 2006, Apple switched processors from PowerPC-based CPUs to Intel, a move many fans worried about. Apple used a technology known as Rosetta to translate existing software and make the transition as seamless as possible - a trick repeated in 2020, when Apple decided to ditch Intel and move to its own chips, based on ARM technology.

The new ARM generations challenge Intel, AMD, and the other CISC-based x86 architectures. Though Microsoft and others have used ARM for a long time, Apple pushed the envelope by introducing chips that are considerably faster with significantly lower energy usage - and the Apple community can hardly contain its enthusiasm.

Why is speed so important? Computers are essentially rather dumb, and it is, of course, the software and the applications that matter: the faster the machine and the more memory it has, the more you can do with it. In the '60s, most companies used mainframes and dumb terminals to access their software. With fast Internet connections and powerful servers, it is funny to see how processing is moving back to the servers again - but this time not to the servers in our 'basements,' but to the cloud.

Centralized CPU and memory enable new services, some of which were impossible to imagine just a few years ago. As Nicholas Negroponte (not Benny Landa) said, everything that can be digital will be digital. Moore's law says that computing power roughly doubles every 18 months, and Chris Anderson's The Long Tail has turned out to be amazingly true. See how music has moved from physical media to online. See how Netflix, Disney+, HBO, and the other streaming services have made DVDs and other physical media practically obsolete.
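To give a feel for what "doubling every 18 months" means in practice, here is a small back-of-the-envelope sketch in Python; the 18-month period and the time spans are assumptions for illustration, not a law of nature.

```python
# Rough illustration of exponential growth at one doubling every 18 months.

def growth_factor(years: float, months_per_doubling: float = 18) -> float:
    """How many times more powerful after the given number of years."""
    return 2 ** (years * 12 / months_per_doubling)

print(f"After 10 years: roughly x{growth_factor(10):,.0f}")   # ~x100
print(f"After 40 years: roughly x{growth_factor(40):,.0f}")   # ~x100,000,000
```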

For gamers, Origin, Steam, Epic Games, and many more now sell and distribute games, and entirely new business models have been created. The next step is even to play remotely, so games (which demand CPU and GPU power like never before) can be streamed over the Internet in real time.

Technology has enabled new business models like subscriptions and pay-per-use, and we have only seen the tip of the iceberg. As the Internet was combined with a more robust infrastructure, more and more IP addresses were needed. With the updated IP protocol (currently IPv6), the address space grew enormously, so an address can be assigned to every connected device - from coffee brewers and computers to printing and binding machines and all the other equipment we use in the graphic arts environment. IoT, or the Internet of Things, connects our devices and opens up for better planning, maintenance, and a significant level of data exchange - which I am NOT so sure all PSPs are aware of!
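To show why IPv6 really is enough for every coffee brewer and press to have its own address, here is a small Python sketch comparing the 32-bit IPv4 address space with the 128-bit IPv6 space; the only assumption is the address widths themselves.

```python
# IPv4 addresses are 32 bits wide, IPv6 addresses are 128 bits wide.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4: {ipv4_addresses:,} addresses")        # ~4.3 billion
print(f"IPv6: {ipv6_addresses:.3e} addresses")      # ~3.4 x 10^38
print(f"IPv6 is {ipv6_addresses // ipv4_addresses:,} times larger")
```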

Shared CPU and memory have now opened the door to applications that would have been impossible just a few years ago. I already mentioned gaming, but now video editing and AI are becoming available as services too. You only need a 'dumb terminal' to play the most CPU- and GPU-intensive games, use the craziest video-editing features, and even tap into AI as a service beyond your imagination!

The AI side of this equation is the direct reason for me to write this article. Last Friday, my son had friends over for dinner, and they spoke about ChatGPT. Chatbots have existed for a long time, but this one is particularly clever - able to write an entire thesis - and it is now used by young people instead of writing and researching themselves. The conversation was about whether it is OK to use technology like ChatGPT, and the obvious answer is no! Some people - and it was an argument at the table - will claim that ChatGPT is nothing but an advanced spell checker. My point in the discussion is that school (at least in Denmark) risks becoming a checklist of deliverables rather than a place to learn. Learning is not about learning how to use MS Word, ChatGPT, or any other tool, but about understanding the relations between what you read, how you formulate yourself, and how you solve problems. Schools in some countries have focused way too much on teaching how to use the tools rather than on how the tools are used to express what you have learned!

One of the things AI is used for, besides the example above, feels almost like something out of a science fiction film: you can talk to your computer and make it do things. Many already use Siri and Google to search, buy, and add tasks, but with the latest updates, far more complex tasks can be handled by AI. An example is the online film editor Runway. Here you can edit videos using voice commands; you can even use the AI to extend the background of a film or photo, adding a background that never existed. Voice commands are used to color grade, remove items, and do things that even your fastest new desktop computer would not have enough CPU and memory to perform. Runway combines AI, server-side performance, the Internet for two-way delivery of data, and, of course, a subscription model that scales with your needs!

More and more will be possible in the future, and one of the more challenging questions we may need to answer is: do we want all this? I am very skeptical of the Metaverse and its VR. As fascinating as it is from a technical perspective, I find it frightening to imagine our kids living in a world that does not exist. Do we really want them to have an even more altered reality?

I have made up my mind and won't buy any VR glasses or goggles - at least not for entertainment. I can see all the advantages in professional life, from doctors performing remote surgery to printers getting service done with a remote technician's help. And yes, maybe I am just not open-minded enough about the technology - I admit that. But we must remember that not everything that is possible is also good.

Let's be open-minded and skeptical simultaneously, and let us control development rather than just letting the market make all the decisions!

So, where are we heading?
Computers will continue to get faster and faster. The development of crazily smart applications will also continue, and in the future, all the things we see in science fiction films may be what our reality looks like. The cost of this development CAN be a lack of democracy and a less educated population, despite having more access to everything than ever. We may need to rethink how we live to see the future from that perspective. Some people have started looking into Democracy 2.0, which is very interesting and quite different - but that's for another story!
