Editor’s picks: 2024’s most exciting technology advancements

In the last 12 months, we have seen significant advances across technology areas, from electric cars to new scientific tools, but much of the conversation has centered on artificial intelligence (AI).

While large language models (the current gold standard, built on the neural networks that power everything from Windows Copilot to ChatGPT) advanced steadily in 2024, this was also the year in which the existential dangers of AI became eerily clear.

Another area poised for radical transformation is quantum computing, where new advances are reported every month. Not only are the machines getting bigger and more powerful, they are also becoming more reliable as scientists move closer to machines that outperform the most powerful supercomputers. Some of the most important advances have been in error correction, a key challenge that must be solved before quantum computers can realize their full potential.

And in the world of electronics, scientists edged closer to realizing a hypothetical component known as “universal memory,” which, if achieved, would transform the devices we use daily.

Here are the most transformative tech developments of 2024.

This year, AI companies released increasingly advanced models, including OpenAI’s o1, the Evo genetic mutation prediction model, and the ESM3 protein sequencing model. We also saw improvements in AI training and processing methods, such as a new tool that makes image generation up to 8 times faster and an algorithm that can compress models until they are small enough to run locally on your smartphone.

Related: Humanity faces a ‘catastrophic’ future if we don’t regulate AI, says Yoshua Bengio, the ‘godfather of AI’

But this was also the year that the existential threats associated with AI came into sharp focus. In January, a study showed that widely used safety training methods failed to remove malicious behavior in models that had been “poisoned,” or engineered to display harmful or undesirable tendencies.

The study, described by its authors as “legitimately scary,” found that in one case, a rogue AI learned to recognize the trigger for its malicious actions and thus tried to hide its antisocial behavior from its human handlers. They could see what the AI was really “thinking” the whole time, of course, but this wouldn’t always be the case in the real world.

It was a busy 12 months in quantum computing research. In January, quantum computing company QuEra created a new machine with 256 physical qubits and 10 “logical qubits” — collections of physical qubits tied together through quantum entanglement that reduce errors by storing the same data in different places. At the time, this was the first machine with built-in quantum error correction. But teams worldwide are trying to reduce the error rate in qubits.
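
To get a feel for why storing the same information in several places helps, here is a minimal classical sketch in Python: a repetition code with majority-vote decoding. Real logical qubits rely on entanglement and stabilizer measurements rather than simple copying, so treat this purely as an analogy for the redundancy idea; the copy count and error rate below are arbitrary.

```python
import random

def encode(bit, copies=5):
    """Store the same bit in several places (classical repetition code)."""
    return [bit] * copies

def add_noise(codeword, flip_prob=0.1):
    """Flip each stored copy independently with some probability."""
    return [b ^ 1 if random.random() < flip_prob else b for b in codeword]

def decode(codeword):
    """Majority vote: the value most copies agree on wins."""
    return 1 if sum(codeword) > len(codeword) / 2 else 0

# With 5 copies and a 10% flip rate, decoding fails only if 3 or more copies
# flip, which is far less likely than a single unprotected bit flipping.
trials = 100_000
failures = sum(decode(add_noise(encode(1))) != 1 for _ in range(trials))
print(f"Logical error rate: {failures / trials:.4%} (vs 10% for one bit)")
```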

The biggest progress in error correction was unveiled in December, when Google scientists announced that they had built a new generation of quantum processing units (QPUs) that crossed a key threshold in error correction: the point at which adding more qubits corrects more errors than it introduces. This should lead to an exponential reduction in errors as the number of entangled qubits increases.
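
One way to picture this “below threshold” behavior is the textbook scaling law for surface codes, in which the logical error rate falls roughly as (p/p_th) raised to the power (d+1)/2 once the physical error rate p sits below the threshold p_th. The sketch below uses made-up numbers rather than Google’s measured figures, purely to show how each increase in code distance suppresses errors further.

```python
# Illustrative scaling of the logical error rate for a surface code operating
# below threshold: eps_L ~ A * (p / p_th) ** ((d + 1) / 2), where d is the
# code distance (more entangled qubits -> larger d). All values hypothetical.
p = 0.001      # assumed physical error rate per operation
p_th = 0.01    # assumed error-correction threshold
A = 0.1        # assumed prefactor

for d in (3, 5, 7, 9):
    eps_logical = A * (p / p_th) ** ((d + 1) / 2)
    print(f"distance {d}: logical error rate ~ {eps_logical:.2e}")
# Each step up in distance multiplies the suppression, which is the
# "exponential reduction in errors" described above.
```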

The new 105-qubit Willow chip, successor to Sycamore, also achieved a striking benchmark result, solving in five minutes a challenge that would have taken a supercomputer 10 septillion years, about a quadrillion times the age of the universe.
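
A quick back-of-the-envelope check of that comparison, assuming the roughly 13.8-billion-year age of the universe:

```python
# Back-of-the-envelope check of the comparison above.
benchmark_years = 10e24          # 10 septillion years, the quoted figure
age_of_universe_years = 13.8e9   # roughly 13.8 billion years

ratio = benchmark_years / age_of_universe_years
print(f"{ratio:.1e}")  # ~7e+14, i.e. on the order of a quadrillion (1e15)
```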

While this year brought several innovative computer components — including a new type of data storage that can withstand extreme heat, as well as a DNA-infused computer chip — some of the biggest advancements came in the development of “universal memory.” This is a type of component that, if realized, would dramatically increase the speed of computing while reducing energy consumption.

All computers use two types of memory at once: temporary memory, such as random access memory (RAM), and long-term storage, such as solid-state drives (SSDs) or flash memory. RAM is incredibly fast but requires constant power; all data stored in RAM is erased when the computer is turned off. SSDs, on the other hand, are slower but can store data without power.
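
A simple way to see the volatile/non-volatile split in practice: data held only in a running program’s memory disappears when the process (or the power) goes away, while data written out to storage survives. A minimal Python sketch, with a hypothetical results.json file standing in for the SSD:

```python
import json
from pathlib import Path

# Volatile: this dict lives in RAM and vanishes when the program exits
# or the machine loses power.
working_data = {"experiment": "universal memory", "run": 1}

# Non-volatile: writing to a file commits the data to the SSD (or other
# long-term storage), so it is still there after a reboot.
storage = Path("results.json")
storage.write_text(json.dumps(working_data))

# Later (or after a restart), the data can be read back from storage.
restored = json.loads(storage.read_text())
print(restored)
```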

Universal memory is a third type of memory that combines the best of the first two kinds — and, in 2024, scientists inched closer to realizing this technology.

At the start of the year, scientists showed that a new material dubbed “GST467” was a viable candidate for phase-change memory — a type of memory that creates the 1s and 0s of computing data by switching a glass-like material between high- and low-resistance states. When the material crystallizes, it represents a 1 and releases a large amount of energy; when it melts, it represents a 0 and absorbs the same amount of energy. In testing, GST467 proved faster and more efficient than other universal memory candidates, such as ULTRARAM, the current front-runner.
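
As a rough illustration of how a phase-change cell encodes a bit, here is a toy Python model that reads the cell by comparing its resistance to a threshold. The resistance values and threshold are invented for the example, and the real physics of crystallization and melt-quenching is far richer than a single state flag.

```python
# Toy model of a phase-change memory cell. Resistance values are
# hypothetical; real GST-type cells differ by orders of magnitude between
# the crystalline (low-resistance) and amorphous (high-resistance) phases.
CRYSTALLINE_OHMS = 1e3   # low resistance -> logical 1
AMORPHOUS_OHMS = 1e6     # high resistance -> logical 0
READ_THRESHOLD = 1e4     # hypothetical cutoff: below this reads as 1

class PhaseChangeCell:
    def __init__(self):
        self.resistance = AMORPHOUS_OHMS  # start in the amorphous (0) state

    def write(self, bit):
        # A "set" pulse crystallizes the material; a "reset" pulse melts it
        # and quenches it back into the amorphous phase.
        self.resistance = CRYSTALLINE_OHMS if bit else AMORPHOUS_OHMS

    def read(self):
        return 1 if self.resistance < READ_THRESHOLD else 0

cell = PhaseChangeCell()
cell.write(1)
print(cell.read())  # -> 1
```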

—Mathematicians devised novel problems to challenge advanced AIs’ reasoning skills — and they failed almost every test

—New quantum computing milestone smashes entanglement world record

—’Crazy idea’ memory device could slash AI energy consumption by up to 2,500 times

Other candidates are also promising — and bizarre. In April, for example, scientists proposed that a weird magnetic quasiparticle known as a “skyrmion” may one day be used in universal memory instead of electrons. In the new study, they sped up skyrmions from their normal speeds of 100 meters per second (roughly 225 mph, or 362 km/h) — which is too slow to be used in computing memory — to 2,000 mph (3,200 km/h).

Then, toward the end of the year, scientists discovered other materials that could also be used for phase-change memory, reducing the energy needed for data storage by up to a billion times. The discovery happened entirely by chance, proving that in the world of science and technology, you never know how close you are to the next breakthrough.

Keumars is the technology editor at Live Science. He has written for publications including ITPro, The Week Digital, ComputerActive, The Independent, The Observer, Metro and TechRadar Pro. He has worked as a technology journalist for more than five years and previously served as features editor at ITPro. He is an NCTJ-qualified journalist and has a degree in biomedical sciences from Queen Mary University of London. He is also registered as a Foundation Chartered Manager with the Chartered Management Institute (CMI), having qualified as a Level 3 Team Leader with Distinction in 2023.
