The Top Programming Languages 2024

TypeScript and Rust are among the rising stars

3 min read

Stephen Cass is the special projects editor at IEEE Spectrum.

Welcome to IEEE Spectrum’s 11th annual rankings of the most popular programming languages. As always, we combine multiple metrics from different sources to create three meta rankings. The “Spectrum” ranking is weighted towards the profile of the typical IEEE member, the “Trending” ranking seeks to spot languages that are in the zeitgeist, and the “Jobs” ranking measures what employers are looking for.

You can find a full breakdown of our methodology here, but let’s jump into our results. At the top, Python continues to cement its overall dominance, buoyed by things like popular libraries for hot fields such as AI, as well as its pedagogical prominence. (For most students today, if they learn one programming language in school, it’s Python.) Python’s pretty popular with employers too, although there its lead over other general-purpose languages is not as large and, like last year, it plays second fiddle to the database query language SQL, which employers like to see paired with another language. SQL’s popularity with employers is a natural extension of today’s emphasis on networked and cloud-based system architectures, where databases become the natural repository for all the bytes a program’s logic is chewing on.

Top Programming Languages 2024 is brought to you by the IEEE Computer Society.

Stalwarts like Java, JavaScript, and C++ also retain high rankings, but it’s what’s going on a little further down that’s particularly interesting. TypeScript—a superset of JavaScript—moves up several places on all the rankings, especially for Jobs, where it climbs to fourth place, versus 11th last year. TypeScript’s primary differentiator from JavaScript is that it enforces static typing of variables, where the type of a variable—integer, floating point, text, and so forth—must be declared before it can be used. This allows for more error checking when TypeScript programs are compiled to JavaScript, and the increase in reliability has proven appealing.
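A minimal sketch of the kind of bug this catches, using a hypothetical `addSalesTax` function (not from any of the rankings’ data):

```typescript
// The ": number" annotations are TypeScript's static types. They are
// checked when the code is compiled and then erased from the emitted
// JavaScript.
function addSalesTax(price: number, rate: number): number {
  return price + price * rate;
}

const total = addSalesTax(100, 0.08); // 108

// addSalesTax("100", 0.08);
// TypeScript rejects the line above at compile time ("Argument of type
// 'string' is not assignable to parameter of type 'number'"). Plain
// JavaScript would happily run it and return the string "1008", because
// "+" on a string concatenates instead of adding.

console.log(total);
```

The point is that the error surfaces when the program is compiled, rather than as a silently wrong value at runtime.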

Another climber is Rust, a language aimed at creating system software, like C or C++. But unlike those two languages, Rust is “memory safe,” meaning it uses a variety of techniques to ensure that programs can’t write to locations in memory that they are not supposed to. Such errors are a major source of security vulnerabilities. Rust’s profile has been rising sharply, boosted by things like a February cybersecurity report from the White House calling for memory-safe languages to replace C and C++. Indeed, C’s popularity appears to be on the wane, falling from fourth to ninth place on the Spectrum ranking and from seventh to 13th on the Jobs ranking.

Two languages have entered the rankings for the first time: Apex and Solidity. Apex is designed for building business applications that use a Salesforce server as a back end, and Solidity is designed for creating smart contracts on the Ethereum blockchain.

This year also saw several languages drop out of the rankings. This doesn’t mean a language is completely dead, it just means that these languages’ signal is too weak to allow them to be meaningfully ranked. Languages that dropped out included Forth, a personal favorite of mine that’s still popular with folks building 8-bit retro systems because of its tiny footprint. A weak signal is also why we haven’t included some buzzy languages such as Zig, although those proficient in it can apparently command some high salaries.

As these other languages come and go from the rankings, I have to give a shout-out to the immortals, Fortran and Cobol. Although they are around 65 years old, you can still find employers looking for programmers in both. For Fortran, this tends to be for a select group of people who are also comfortable with high-energy physics, especially the kind of high-energy physics that goes boom (and with the security clearances to match). Cobol is more broadly in demand, as many government and financial systems still rely on decades-old infrastructure—and the recent paralyzing impact of the CrowdStrike/Microsoft Windows outage probably hasn’t done much to encourage their replacement!


This article is part of The Scale Issue.

Longest Continuously Operating Electronic Computer

Voyager 1 and its twin, Voyager 2, both launched by NASA in 1977, were the first human-made objects to reach interstellar space. But that’s not the only record the spacecraft hold. Voyager 2’s Computer Command System has not been turned off since it first booted up about 48 years ago, making it the longest continuously operating electronic computer.

Quietest Place on Earth

Can you hear your own heartbeat? For most of us, the answer is no—unless you’re standing in Orfield Laboratories’ anechoic chamber, in which case, you might be able to hear the blood rushing through your veins and the sound of your own blinking, too. The chamber in Minneapolis holds the title for quietest place on earth, with a background noise reading of –24.9 A-weighted decibels—meaning that the ambient sound is far below the threshold of human hearing.

Longest-Lasting Battery

An experimental electric bell at the University of Oxford, in England, has been ringing nearly continuously for 185 years. Powered by two dry piles—an early type of battery—connected in series, the bell has rung more than 10 billion times since it was set up in 1840. Its ringing, however, is now barely audible beneath the glass bell jar protecting the experiment.

Fastest Typing Using Brain Signals

For people with certain neurodegenerative conditions that impact muscle control, communication can be difficult. Brain–computer interfaces offer a solution by directly translating brain waves to text. But until recently, that translation has been slow. In 2022, researchers at the University of California, San Francisco, set the record for the fastest communication via brain signals: 78 words per minute.

Best-Selling Consumer Electronics

Certain consumer electronics, like the iPhone, seem ubiquitous. Over 18 years and about as many generations, more than 2.3 billion Apple smartphones have been sold. But when you break it down to individual models, which devices have been the biggest success? See how some particularly popular devices compare.

Strongest Magnetic Field on Earth

At least among magnets that don’t explode from their own field strength, the U.S. National High Magnetic Field Laboratory’s Pulsed Field Facility holds the record for strongest magnetic field on earth. The 100-Tesla field, which is about 2 million times as strong as Earth’s magnetic field, can be turned on for 15 milliseconds just once an hour.

Biggest Teatime Electricity Spike

Brits love their tea. That’s why the United Kingdom’s National Grid engineers have to manage surges in energy use during popular broadcast events, when many viewers put their kettles on simultaneously. The biggest spike occurred during the 1990 World Cup semifinal. Just after England lost the game-deciding penalty shootout, demand surged by 2,800 megawatts, equivalent to the electricity used by approximately 1.1 million kettles.

Strongest Robotic Arm

In March, Rise Robotics celebrated the Beltdraulic SuperJammer Arm’s setting of the Guinness World Record for Strongest Robotic Arm Prototype. A collaboration between Rise and the U.S. Air Force, the arm lifted an astonishing 3,182 kilograms, about the weight of an adult female African elephant. Unlike other heavy-lifting machines, the robot uses no hydraulics, only electric power, and it improves efficiency by generating electricity when it’s lowering a load.

Smallest Pacemaker

Implanting most pacemakers requires invasive surgeries. But a group of researchers at Northwestern University, in Evanston, Ill., has developed a device that can be implanted through the tip of a syringe. Measuring 3.5 millimeters in its largest dimension and suited for newborns with heart defects, the pacemaker—which is designed for patients who need only temporary pacing—safely dissolves in the body after it has done its job.

Fastest Data Transfer

Earlier this year, a team from the National Institute of Information and Communications Technology and Sumitomo Electric, in Japan, blasted a record 1.02 petabits (1.02 million billion bits) across 1,808 kilometers in one second, or 1.86 exabits per second-kilometer. At that rate, in one second, you could send everything everyone in the world watched on Netflix in the first half of this year from Tokyo to Shanghai 4,000 times. A special 19-core optical fiber made it possible.

Fastest EV Charging

The Chinese automaker BYD used a new fast-charging system that peaked at 1,002 kilowatts and added 421 kilometers of range to a Han L sedan in under five minutes. That’s about 84 kilometers per minute. Among the key innovations behind the feat: 1,500-volt silicon carbide transistors and lithium iron phosphate batteries with half the internal resistance of their predecessors.


The Future of the Grid: Simulation-Driven Optimization

Multiphysics simulation provides critical insight into complex power grid components

5 min read
Lightning strikes a tower’s shielded wires. The induced voltage on the three-phase conductors is computed using electromagnetic field analysis.
COMSOL

This is a sponsored article brought to you by COMSOL.

Simulation software is useful in the analysis of new designs for improving power grid resilience, ensuring efficient and reliable power distribution, and developing components that integrate alternative energy sources, such as nuclear fusion and renewables. The ability to simulate multiple physical phenomena in a unified modeling environment gives engineers a deeper understanding of how different components of the grid interact with and affect each other.

For example, when designing the various components of grid infrastructure, such as transformers and transmission lines, multiphysics electromagnetic field analysis is essential for ensuring the safety of the surrounding individuals and environment. Understanding thermal behavior, another phenomenon involving multiple physics, is equally necessary for the design of grid components where heat dissipation and thermal stresses can significantly affect performance and lifespan. Structural and acoustics simulation, meanwhile, is used to predict and mitigate issues like transformer vibration and noise — an important practice for ensuring the longevity and reliability of grid components.

Multiphysics simulation provides critical insight into the complex interactions at play within power grid components, enabling engineers to virtually test and optimize future grid designs.

Electric breakdown and corona discharge analyses are particularly vital for high-voltage transmission lines, as such phenomena can compromise the performance of their insulation systems. Simulation allows development teams to predict where such events are likely to happen, enhancing the design of insulators and other components where the goal is to minimize energy loss and material degradation.

As a real-world example, one leading manufacturer uses the COMSOL Multiphysics® simulation software to develop magnetic couplings, a noncontact alternative to mechanical transmission that enables power transfer without the friction-based limitations of continual contact. While friction-free power transmission means that magnetic couplings have found applications in a broad range of technologies, including offshore wind turbines, these systems must be developed carefully to avoid degradation. By employing highly nonlinear hysteresis curves and applying its own material temperature dependences for magnetic loading, the manufacturer’s development team has successfully used multiphysics simulation to help prevent the permanent magnets from reaching critical temperatures, which can cause irreversible demagnetization and compromise the reliability of the designs. Additionally, because use cases for magnetic couplings are so diverse, the company’s design engineers must be able to interchange magnet shapes and materials to meet customer requirements without building costly and time-consuming prototypes. That makes multiphysics simulation a powerful approach for characterizing configurations, providing virtual prototypes of designs, and ultimately reducing the price for customers while remaining attentive to fine details.

These examples show just a few of the ways that coupling multiple interacting physics within a single model can lead to successful simulation of real-world phenomena and thereby provide insights into current and future designs.


Improving Reliability with Digital Twins & Simulation Apps

Engineering teams can also use simulation technology to create more efficient, effective, and sustainable power grids by creating digital twins. A digital twin contains a high-fidelity description of a physical product, device, or process — from the microscopic to the macroscopic level — that closely mirrors its real counterpart. For every application, the digital twin is continuously receiving information, ensuring an up-to-date and accurate representation.

With this technology, grid operators and their equipment suppliers can predict which components are most likely to fail, enabling them to schedule maintenance and replacement more efficiently and thereby improving grid reliability. Digital twins can be made for equipment ranging from power sources including solar cells and wind turbines to power distribution systems and battery energy storage.


An offshore wind farm where lightning strikes one of the turbine blades. The electric field on the turbine towers, seawater, and seabed is shown.
COMSOL

The most recent modeling and simulation technology provides power and energy companies with tools for creating digital twins in the form of standalone simulation apps, which significantly increases the number of users who have access to advanced simulation technology. By including only relevant functionality in a standalone simulation app, colleagues with no modeling and simulation experience can utilize this technology without needing guidance from the modeling specialist. Furthermore, the use of data-driven surrogate models in simulation apps enables near-instantaneous evaluation of what would otherwise be time-consuming simulations — which means that simulation technology can now be used in a real-world setting.

Digital twins, in the form of standalone apps, bring the power of simulation to the field, where grid operators can utilize real-time performance information to ensure grid reliability.

For instance, one organization that works with local power companies to analyze equipment maintenance and failure built a custom app based on a multiphysics model it had developed to predict cable faults and improve troubleshooting efficiency. While engineers have been utilizing simulation in labs for decades, cable failure occurs in the field, and onsite troubleshooting personnel are responsible for assessing these failure conditions. With this in mind, an engineer at the organization developed the simulation app using the Application Builder in COMSOL Multiphysics®.

Temperature distribution in a battery energy storage system (BESS).
COMSOL

The app features relevant parameters that troubleshooting personnel with no prior simulation experience can easily modify. Field technicians enter cable data and select the type of fault, which modifies the multiphysics model in real time, allowing the app to evaluate and output the data necessary to understand the condition that led to the fault. The app then produces a reported potential and electric field, which leads the technicians to an informed decision regarding whether they need to replace or repair the cable. Following the app’s successful deployment, the engineer who developed it stated, “The simulation app plays a key role in cable maintenance. It makes the work of our field technicians more efficient by empowering them to confidently assess and repair faults.”

Routine physical tests of grid equipment cannot fully reflect conditions or determine failure types in many situations, as a large number of complex factors must be considered, such as cable structure and material, impurities in the cable, voltage fluctuation, and operating conditions and environments. As a result, simulation has proven to be indispensable in many cases for collecting accurate cable health assessments — and now in the form of custom apps, it is more accessible than ever.

Generating Nuclear Solutions

Simulation has also been heavily integrated into the design process of various components related to the nuclear industry. For example, simulation was used to help design generator circuit breakers (GCBs) for nuclear power plants. GCBs must be reliable and able to maintain performance even after long periods of inactivity. The COMSOL Multiphysics® software can be used to improve the current-carrying capacity of the GCBs, which can offer protection from current surges and provide dependable electricity generation.

The design of nuclear fusion machines like tokamaks has also benefitted from the use of simulation. These devices must be able to withstand high heat fluxes and plasma disruptions. COMSOL Multiphysics® has been used to help engineers predict the effects of these problems and come up with design solutions, such as adding a structural support system that can help reduce stress and survive challenging conditions.

Engineering the Grid of Tomorrow

The development of next-generation power grid systems is a complex and dynamic process that requires safe, reliable, and affordable testing. Multiphysics simulation technology can play a major role in future innovations for this industry, enabling engineers to anticipate and analyze the complex interactions happening inside these devices while building upon the existing infrastructure to address the demands of modern-day consumption.

COMSOL Multiphysics is a registered trademark of COMSOL AB.



Power Sector Faces Engineer Shortage

Will there be enough engineers to meet growing energy demands?

5 min read

A U.S. airman rewires a building’s electrical system during a Deployment for Training at Aviano Air Base in Italy.

Victoria Jewett/U.S. Air National Guard

As renewable energy scales up and data centers demand more electricity than ever, the electrical systems that power the modern world face mounting stress. At the heart of this infrastructure are power engineers: technical professionals who design, build, and maintain the power grid. But just as demand for electricity skyrockets, a shortage in talent now threatens to derail the energy sector.

According to a recent joint study by consulting firm Kearney and IEEE, the global power sector will need between 450,000 and 1.5 million more engineers by 2030 to build, implement, and operate energy infrastructure. Already, 40 percent of power executives report difficulty hiring skilled workers, citing talent competition and insufficient skills as key barriers.

That gap is a major concern for energy leaders. Some say that a talent shortage could threaten to delay critical infrastructure upgrades just when the grid needs to evolve faster than ever. More frequent service disruptions could follow as extreme weather events strain aging grid equipment, and energy demands continue to grow. Hiring bottlenecks could also stall timelines for clean-energy integration, transmission expansions, and nuclear-plant maintenance, jeopardizing both grid reliability and the pace of decarbonization.

In response, utilities and engineering firms are exploring new strategies to bolster the pipeline, including partnering with universities, investing in apprenticeship programs, and turning to emerging technologies to offset gaps in the workforce.

“Without enough engineers, these critical projects will be delayed, compounding reliability risks and slowing the energy transition at the exact moment when momentum is needed most,” Andre Begosso, a partner at Kearney involved in the study, told IEEE Spectrum.

A “Perfect Storm” for a Talent Shortage

The root causes of the engineering shortage are mounting.

Baby boomers in the workforce are retiring, and not enough young people are stepping in to fill their shoes, leaving utilities and engineering firms with more open roles than people to fill them. “Right now, utility companies are facing a perfect storm of a labor crisis, with an aging workforce and a lack of younger employees to replenish them,” says Kevin Miller, chief technology officer for North America at IFS, a software company with energy-sector clients.

Tough working conditions are also pushing engineers out of the field. “Nearly half of all power engineers changed jobs, employers, or left the industry in just the past three years, with burnout and lack of creative problem-solving opportunities among the top reasons,” Begosso says. In nuclear power, where reliability is crucial, Begosso says the churn is even higher: 58 percent of engineers have moved roles.

On the education front, the pipeline is drying up. Begosso says that university enrollment in power engineering programs has stagnated as engineering students flock to high-tech fields like data science, software engineering, and AI, which are seen as more exciting and lucrative.

Even when companies find new hires, they’re slow to onboard. “Hiring and onboarding these skilled employees takes longer than in other sectors,” says Miller. Manual practices and long training cycles stretch timelines, he says, and when errors occur, supervisors must divert attention away from core operations.

Without enough power engineers, substations may not receive the maintenance and upgrades they need.
Sirathee Boonpanyarak/Alamy

How Engineering Firms Are Managing the Gap

At Black & Veatch, one of the largest engineering firms in the United States, early-career recruitment is crucial for filling staffing needs. The company turns 85 to 90 percent of its interns into entry-level hires, and last summer, that rate hit 93 percent, according to Ryan Elbert, executive vice president and global director of engineering and development services at Black & Veatch. He says the firm invests heavily in growing its own talent pool, hiring interns and recent graduates to fill about 10 to 15 percent of its engineering teams.

But even with this strategy Black & Veatch isn’t immune. “I think the talent shortage really plays out a lot more at the experienced level,” Elbert says. “Hiring top-tier seasoned engineers is really no small feat.”

Smaller firms are especially feeling the heat. With limited resources, they often find themselves in direct competition with industry giants that can offer higher pay and better perks.

Heather Eason, CEO of Select Power Systems, a boutique engineering firm, recounts how a junior engineer—hired straight out of college and trained in substation design—was poached by a competitor within six months for just US $5 more an hour. Particularly for younger workers, “money speaks,” Eason says.

Many large firms can plan for staffing needs long before a project begins, but smaller firms don’t always have that luxury. Eason says that Select Power typically spends 6 to 9 months filling roles. So when a key engineer left Select for a larger firm, the company had to walk away from a major transmission line project. “I’m not going to be able to put a T-line [transmission line] team together in eight weeks,” she says.

Obtaining the credentials needed to become a licensed professional engineer could exacerbate the talent gap. Eason says that becoming a licensed professional engineer requires a bachelor’s degree in electrical engineering, completion of the Engineer-in-Training (EIT) and Professional Engineering (PE) exams, and four years of work experience in between those tests. While engineers don’t need to hold these certifications to be hired, having them on their résumés provides engineers with the credentials employers look for when hiring for better-paying jobs with more responsibility.

That lengthy process could deter engineers from taking the exams, potentially stifling their career development, especially for women who may have additional caretaking responsibilities. “I never got my PE registration, and it has definitely limited me,” Eason says. “I didn’t have the ability to just stop being a mom to four kids so I could study for six months for an 8-hour exam.”

Participants receive training during the 2025 International Hybrid Power Plants & Systems Workshop in Mariehamn, Åland.
Energynautics

Tech, Training, and Transfer of Knowledge

Rather than chasing an ever-shrinking pool of engineers, some utilities are using technology to get more out of the staff they already have. For instance, IFS deploys augmented reality tools designed to help senior technicians assist their less experienced colleagues in troubleshooting problems in real time.

Companies in need of talent are also investing in upskilling initiatives. At Power Academy, a training program for utility workers run by global engineering firm TRC Companies, director of technical services Anna Campbell says demand from utilities and data centers is growing fast. Those clients require expertise in protection, controls, and substation engineering—skills learned as part of the Power Academy training program. “There simply isn’t enough talent to meet the need,” Campbell told Spectrum.

In the United Kingdom, Excitation & Engineering Services (EES)—whose clients include ConocoPhillips, General Electric, and Siemens—has structured its graduate recruitment program to ensure young engineers work alongside seasoned employees from day one. “It shortens the learning curve but, most importantly, it passes on that knowledge from senior engineers while it’s still available,” EES director Ryan Kavanagh says.

Meanwhile, programs such as the education division of Bentley Systems, in Exton, Pa., offer university students free access to engineering software, online courses, and global competitions that prepare them for engineering jobs in the energy sector. “The more we can embed innovation deeper into education, the faster we can deliver on sustainable, resilient infrastructure projects,” says Chris Bradshaw, chief sustainability and education officer at Bentley, which is an infrastructure engineering software firm.

Looking ahead, university initiatives that encourage engineering students to pursue jobs in the energy sector, combined with the use of technologies like generative AI and cultural shifts, mark steps toward easing the talent crunch. But the clock is ticking for utilities and engineering firms to address the talent gap before it’s too late.

“If the gap persists, the industry simply won’t be able to deliver on its potential,” says Begosso. “The strength of the energy workforce will go a long way in determining how competitive and reliable the power sector, and the economy it underpins, can be in the next decade.”



How Genome Sequencing Is Transforming Biology

The Earth BioGenome Project is a bioinformatics blockbuster

11 min read

On an alpine mountain trail above Malles Venosta, Italy, Teo Valentino, a Ph.D. student in biology at the University of Neuchâtel, in Switzerland, captured a moth lured to a light trap. Samples taken from the moth were sent to Cambridgeshire, England, so that the creature’s genome could be sequenced. That sequence will eventually be added to a database maintained by the Earth BioGenome Project, which aims to sequence the genome of every species of plant, animal, fungus, and many other organisms.

Luigi Avantaggiato

A gibbous moon hangs over a lonely mountain trail in the Italian Alps, above the village of Malles Venosta, whose lights dot the valley below. Benjamin Wiesmair stands next to a moth trap as tall as he is, his face, bushy beard, and hair bun lit by its purple glow. He’s wearing a headlamp, a dusty and battered smartwatch, cargo shorts, and a blue zip sweater with the sleeves pulled up. Countless moths beat frenetically around the trap’s white, diaphanous panels, which are swaying with ghostly ripples in a gentle breeze. Wiesmair squints at his smartphone, which is logged on to a database of European moth species.

“Chersotis multangula,” he says.

“Yes, we need that,” comes the crisp reply from Clara Spilker, consulting a laptop.


Wiesmair, an entomologist at the Tyrolean State Museums, in Innsbruck, Austria, and Spilker, a technical assistant at the Senckenberg German Entomological Institute, in Müncheberg, are taking part in one of the most far-reaching biological initiatives ever: obtaining a genome sequence for nearly every named species of eukaryotic organism on the planet. All 1.8 million of them. The researchers are part of an expedition for Project Psyche, which is sampling European butterflies and moths and will feed its data into the global initiative, called the Earth BioGenome Project (EBP).

Entomologist Benjamin Wiesmair [at right] uses his smartphone to consult a Lepidoptera database to identify the species of moths captured during a trapping session on an alpine trail above Malles Venosta, Italy. Clara Spilker and Alena Sucháčková [middle] consult a table to determine whether the species are needed for genome sequencing.

Luigi Avantaggiato

Eukaryotes are organisms whose cells contain a nucleus. From protozoa to human beings, all have the same basic biological mechanism for building, maintaining, and propagating their form of life: a genome. It’s the sum total of the genes carried by the creature.

Twenty-two years ago, researchers announced that for the first time they had mapped, or “sequenced,” nearly all of the genes in a human genome. The project cost more than US $3 billion and took 13 years, but it eventually transformed medical practice. In the new era of genomic medicine, doctors can take a patient’s specific genetic makeup into consideration during diagnosis and treatment.

Many moths, attracted to the ultraviolet lights, were captured during a light-trapping excursion near Malles Venosta, Italy.

Luigi Avantaggiato

The EBP aims to reach its monumental goal by 2035. As of July 2024, its tally of genomes sequenced stood at about 4,200. Success will undoubtedly depend on researchers’ ability to scale several biotechnologies.

“We need to scale, from where we’re at, more than a hundredfold in terms of the number of genomes per year that we’re producing worldwide,” says Harris Lewin, who leads the EBP and is a professor and genetics researcher at Arizona State University.

One of the most crucial technologies that must be scaled is a technique called long-read genome sequencing. Specialists on the front lines of the genomic revolution in biology are confident that such scaling will be possible, their conviction coming in part from past experience. “Compared to 2001,” when the Human Genome Project was nearing completion, “it is now approximately 500,000 times cheaper to sequence DNA,” says Steven Salzberg, a Bloomberg Distinguished Professor at Johns Hopkins University and director of the school’s Center for Computational Biology. “And it is also about 500,000 times faster to sequence,” he adds. “That is the scale, over the past 25 years, a scale of acceleration that has vastly outstripped any improvements in computational technology, either in memory or speed of processors.”

A lepidopterist wrote identifying information on a label affixed to a specimen jar containing a moth captured during a light-trapping excursion near Malles Venosta, Italy.

Luigi Avantaggiato

There are many reasons to cheer on the EBP and the technological advances that will underpin it. Having established a genome for every eukaryotic creature, researchers will gain deep new insights into the connections among the threads in Earth’s web of life, and into how evolution proceeded for its myriad life forms. That knowledge will become increasingly important as climate change alters the ecosystems on which all of those creatures, including us, depend.

And although the project is a scientific collaboration, it could spin off sizable financial windfalls. Many drugs, enzymes, catalysts, and other chemicals of incalculable value were first identified in natural samples. Researchers expect many more to be discovered in the process of identifying, in effect, each of the billions of eukaryotic genes on Earth, many of which encode a protein of some kind.

“One idea is that by looking at plants, which have all sorts of chemicals, often which they make in order to fight off insects or pests, we might find new molecules that are going to be important drugs,” says Richard Durbin, professor of genetics at the University of Cambridge and a veteran of several genome sequencing initiatives. The immunosuppressant and cancer drug rapamycin, to cite just one of countless examples, came from a microbe genome.

Your Genes Are a Big Reason Why You’re You

The EBP is an umbrella organization for some 60 projects (and counting) that are sequencing species either in a particular region or within a particular taxonomic group. The overachiever is the Darwin Tree of Life Project, which is sequencing all species in Britain and Ireland, and has contributed about half of all of the genomes recorded by the EBP so far. Project Psyche was spun out of the Darwin Tree of Life initiative, and both have received generous support from the Wellcome Trust.

To get an idea of the magnitude of the overall EBP, consider what it takes to sequence a species. First, an organism must be found or captured and sampled, of course. That’s what brought Wiesmair, Spilker, and 41 other lepidopterists to the Italian Alps for the Project Psyche expedition this past July. Over five days, they collected more than 200 new species for sequencing, which will augment the 1,000 Lepidoptera genome sequences already completed and the roughly 2,000 samples awaiting sequencing. There’s still plenty of work to be done; there are around 11,000 species of moths and butterflies across Europe and Britain.

After sampling, genetic material—the creature’s DNA—is collected from cells and then broken up into fragments that are short enough to be read by the sequencing machines. After sequencing, the genome data is analyzed to determine where the genes are and, if possible, what they do.


DNA is a molecule whose structure is the famous double helix. It resides in the nucleus of every cell in the body of every living thing. If you think of the molecule as a twisted ladder, the rungs of the ladder are formed by pairs of chemical units called bases. There are four different bases: adenine (A), guanine (G), cytosine (C), and thymine (T). Adenine always pairs with thymine, and guanine always pairs with cytosine. So a “rung” can be any of four things: A–T, T–A, C–G, or G–C.

Those four base-pair permutations are the symbols that spell out the code of life. Strings of them make up the genome as segments of various lengths called genes. Your genes at least partially control most of your physical and many of your mental traits—not only what color your eyes are and how tall you are but also what diseases you are susceptible to, how difficult it is for you to build muscle or lose weight, and even whether you’re prone to motion sickness.
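The pairing rule lends itself to a tiny translation table. Here is a minimal Python sketch (the function name is mine, for illustration only) that derives the complementary strand for any DNA sequence:

```python
# Watson-Crick pairing: A pairs with T, C pairs with G.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the complementary strand, base by base."""
    return "".join(PAIRS[base] for base in strand.upper())

print(complement("GATTACA"))  # -> CTAATGT
```

Because pairing is symmetric, applying the function twice returns the original strand, which is exactly the property the polymerase enzyme exploits when it rebuilds a double helix from a single strand.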

How Long-Read Genome Sequencing Works

Long-read sequencing starts by breaking up a sample of genetic material into pieces that are often about 20,000 base pairs long. Then the sequencing technology reads the sequence of base pairs on those DNA strands to produce random segments, called “reads,” of DNA that are at least 10,000 pairs in length. Once those long reads are obtained, powerful bioinformatics software is used to build longer stretches of contiguous sequence by overlapping reads that share the same sequence of bases.

To understand the process, think of a genome as a novel, and each of its separate chromosomes as a chapter in the novel. Imagine shredding the novel into pieces of paper, each about 5 square centimeters. Your job is to reassemble them into the original novel (unfortunately for you, the pages aren’t numbered). What makes this task possible is overlap—you shredded multiple copies of the novel, and the pieces overlap, making it easier to see where one leaves off and another begins.

Making it much harder, however, are the many sections of the book filled with repetitive nonsense: the same word repeated hundreds or even thousands of times. At least half of a typical mammalian genome consists of these repetitive sequences, some of which have regulatory functions, while others are regarded as “junk” DNA, descended from ancient genes or viral infections and no longer functional. Long-read technology is adept at handling these repetitive sequences. Going back to the novel-shredding analogy, imagine trying to reassemble the book after it was shredded into pieces of only 1 square centimeter rather than 5. That’s analogous to the challenge that researchers formerly faced trying to assemble million-base-pair DNA sequences using older, “short-read” sequencing technology.
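The reassembly the shredding analogy describes is, at heart, an overlap-merge computation. The toy Python sketch below is a naive greedy merge over error-free reads (real assemblers use far more sophisticated algorithms and data structures); it shows the core idea of joining reads wherever a suffix of one matches a prefix of another:

```python
def overlap(a: str, b: str, min_len: int) -> int:
    """Length of the longest suffix of a matching a prefix of b (at least min_len), else 0."""
    start = 0
    while True:
        start = a.find(b[:min_len], start)
        if start == -1:
            return 0
        if b.startswith(a[start:]):
            return len(a) - start
        start += 1

def greedy_assemble(reads: list[str], min_len: int = 3) -> str:
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for a in reads:
            for b in reads:
                if a is not b:
                    olen = overlap(a, b, min_len)
                    if olen > best[0]:
                        best = (olen, a, b)
        olen, a, b = best
        if olen == 0:
            break  # no remaining overlaps; reads can't be joined
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[olen:])  # merge, keeping the overlap once
    return reads[0]

# Reads "shredded" from the sequence TTGCATCGGATT, overlapping by a few bases:
print(greedy_assemble(["TTGCATC", "CATCGGA", "GGATT"]))  # -> TTGCATCGGATT
```

On real data, sequencing errors and the repetitive stretches described above are precisely what break this naive approach, which is why longer reads help so much: a single read can span an entire repeat and anchor it unambiguously.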

The Two Approaches to Long-Read Sequencing

The long-read sequencing market has two leading companies—Oxford Nanopore Technologies (ONT) and Pacific Biosciences of California (PacBio)—which compete intensely. The two companies have developed utterly different systems.

The heart of ONT’s system is a flow cell that contains 2,000 or more extremely tiny apertures called, appropriately enough, nanopores. The nanopores are anchored in an electrically resistant membrane, which is integrated onto a sensor chip. In operation, each end of a segment of DNA is attached to a molecule called an adapter that contains a helicase enzyme. A voltage is applied across the nanopore to create an electric field, and the field captures the DNA with the attached adapter. The helicase begins to unzip the double-stranded DNA, with one of the DNA strands passing through the nanopore, base by base, and the other released into the medium.

OPTICAL SEQUENCING (Pacific Biosciences)

Chris Philpot

A polymerase enzyme replicates the DNA strand, matching and connecting each base to a specially engineered, complementary nucleotide. That nucleotide flashes light in a characteristic color that identifies which base is being connected.

Each DNA strand is immobilized at the bottom of a well.

As the DNA strand is replicated, each base emits a tiny flash of light as it is incorporated, in a color that is characteristic of the base. The sequence of light flashes indicates the sequence of bases.

What propels the strand through the nanopore is that voltage—it’s only about 0.2 volts, but the nanopore is only 5 nanometers wide, so the electric field is several hundred thousand volts per meter. “It’s like a flash of lightning going through the pore,” says David Deamer, one of the inventors of the technology. “At first, we were afraid we would fry the DNA, but it turned out that the surrounding water absorbed the heat.”

That kind of field strength would ordinarily propel the DNA-based molecule through the pore at speeds far too fast for analysis. But the helicase acts like a brake, causing the molecule to go through with a ratcheting motion, one base at a time, at a still-lively rate of about 400 bases per second. Meanwhile, the electric field also propels a flow of ions across the nanopore. This current flow is decreased by the presence of a base in the nanopore—and, crucially, the amount of the decrease depends on which of the four bases, A, T, G, or C, is entering the pore. The result is an electrical signal that can be rapidly translated into a sequence of bases.
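That last translation step can be caricatured in a few lines of Python. The current levels below are invented for illustration (real nanopore basecallers use neural networks over noisy signals influenced by several bases at once); the sketch simply assigns each measured current drop to the base with the nearest characteristic level:

```python
# Hypothetical characteristic current drops per base -- values are illustrative only.
LEVELS = {"A": 8.0, "T": 11.0, "G": 14.0, "C": 17.0}

def basecall(current_drops: list[float]) -> str:
    """Map each current measurement to the base with the nearest characteristic level."""
    return "".join(
        min(LEVELS, key=lambda base: abs(LEVELS[base] - drop))
        for drop in current_drops
    )

print(basecall([8.2, 13.7, 11.1, 16.5]))  # -> AGTC
```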

NANOPORE SEQUENCING (Oxford Nanopore)

Chris Philpot

The helicase enzyme unzips and unravels the double-stranded DNA, and one strand enters the nanopore. The enzyme feeds the strand through the nanopore with a ratcheting motion, base by base.

The ionic current is reduced by a characteristic amount, depending on the base. The current signal indicates the sequence of bases.

PacBio’s machines rely on an optical rather than an electronic means of identifying the bases. PacBio’s latest process, which it calls HiFi, begins by capping both ends of the DNA segment and untwisting it to create a single-stranded loop. Each loop is then placed in an infinitesimally tiny well in a microchip, which can have 25 million of those wells. Attached to each loop is a polymerase enzyme, which serves a critical function every time a cell divides. It attaches to single-stranded DNA and adds the complementary bases, making each rung of the ladder whole again. PacBio uses special versions of the four bases that have been engineered to fluoresce in a characteristic color when exposed to ultraviolet light.

A UV laser shines through the bottom of the tiny well, and a photosensor at the top detects the faint flashes of light as the polymerase goes around the DNA sample loop, base by base. The upshot is that there is a sequence of light flashes, at a rate of about three per second, that reveals the sequence of base pairs in the DNA sample.

Because the DNA sample has been converted into a loop, the whole process can be repeated, to achieve higher accuracy, by simply going around the loop another time. PacBio’s flagship Revio machine typically makes five to 10 passes, achieving median accuracy rates as high as 99.9 percent, according to Aaron Wenger, senior director of product marketing at the company.
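The reason repeated passes boost accuracy can be seen with a simple per-position majority vote, sketched below in Python (PacBio's actual circular-consensus algorithm is considerably more elaborate than this):

```python
from collections import Counter

def consensus(passes: list[str]) -> str:
    """Majority vote at each position across repeated reads of the same loop."""
    return "".join(Counter(column).most_common(1)[0][0] for column in zip(*passes))

# Three noisy passes over the same stretch of DNA, each with one random error:
passes = ["GATTACA", "GATCACA", "GATTAGA"]
print(consensus(passes))  # -> GATTACA
```

With a 1 percent per-base error rate, for instance, a majority vote over five independent passes fails only when three or more passes err at the same position, which happens roughly 0.001 percent of the time (assuming independent errors).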

How Researchers Will Scale Up Long-Read Sequencing

That kind of accuracy doesn’t come cheap. A Revio system, which has four chips, each with 25 million wells, costs around $600,000, according to Wenger. It weighs 465 kilograms and is about the size of a large household refrigerator. PacBio says a single Revio can sequence about four entire human genomes in a 24-hour period for less than $1,000 per genome.

ONT claims accuracy above 99 percent for its flagship machine, called PromethION 24. It costs around $300,000, according to Rosemary Sinclair Dokos, chief product and marketing officer at ONT. Another advantage of the ONT PromethION system is its ability to process fragments of DNA with as many as a million base pairs. ONT also offers an entry-level system, called MinION Mk1D, for just $3,000. It’s about the size of two smartphones stacked on top of each other, and it plugs into a laptop, offering researchers a setup that can easily be toted into the field.

At the Centro Nacional de Análisis Genómico, in Barcelona, technician Álvaro Carreras prepares a PromethION long-read sequencing machine, from Oxford Nanopore Technologies, to sequence a genome. Behind Carreras is a Pacific Biosciences Revio long-read machine.

Luigi Avantaggiato

Although researchers often have strong preferences, it’s not uncommon for a state-of-the-art genetics laboratory to be equipped with machines from both companies. At Barcelona’s Centro Nacional de Análisis Genómico, for example, researchers have access to both PacBio Revio machines and the PromethION 24 and GridION machines from ONT.

Durbin, at Cambridge University, sees lots of upside in the current situation. “It’s very good to have two companies,” he declares. “They’re in competition with each other for the market.” And that competition will undoubtedly fuel the tech advances that the EBP’s backers are counting on to get the project across the finish line.

A technician at the Centro Nacional de Análisis Genómico, in Barcelona, holds a flow cell for a PromethION long-read sequencing machine from Oxford Nanopore Technologies. The flow cell contains a chip that interacts with the sample of DNA to perform the long-read sequencing.

Luigi Avantaggiato

PacBio’s Wenger notes that the 25-million-well chips that underpin its Revio system are still being fabricated on 200-millimeter semiconductor wafers. A move to 300-mm wafers and more advanced lithographic techniques, he says, would enable them to get many more chips per wafer and put hundreds of millions of wells on each of those chips—if the market demands it.

At ONT, Dokos describes similar math. A single flow cell now consists of more than 2,000 nanopores, and a state-of-the-art PromethION 24 system can have 24 flow cells (or upward of 48,000 nanopores) running in parallel. But a future system could have hundreds of thousands of nanopores, she says—again, if the market demands it.

The EBP will need all of those advances, and more. EBP director Lewin notes that after seven years, the three-phase initiative is wrapping up phase one and preparing for phase two. The goal for phase two is to sequence 150,000 genomes between 2026 and 2030. For phase two, “We’ve got to get to 37,500 genomes per year,” Lewin says. “Right now, we’re getting close to 3,000 per year.” In phase two, the cost per genome sequenced will also have to decline from roughly $26,000 per genome in phase one to $6,100, according to the EBP’s official road map. That $6,100 figure includes all costs—not just sequencing but also sampling and the other stages needed to produce a finished genome, with all of the genes identified and assigned to chromosomes.

A technician at the Centro Nacional de Análisis Genómico, in Barcelona, introduces a sample of fragmented DNA for sequencing in a PromethION machine from Oxford Nanopore Technologies.

Luigi Avantaggiato

Phase three will up the ante even higher. The road map calls for more than 1.65 million genome sequences between 2030 and 2035 at a cost of $1,900 per genome. If they can pull it off, the entire project will have cost roughly $4.7 billion—considerably less in real terms than what it cost to do just the human genome 22 years ago. All of the data collected—the genome sequences for all named species on Earth—will occupy a little over 1 exabyte (1 billion gigabytes) of digital storage.
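The road map's arithmetic can be checked directly from the figures quoted above:

```python
# Figures quoted in the article, from the EBP road map.
current_rate = 3_000                    # genomes sequenced per year today
phase2_rate = 150_000 / 4               # phase two: 150,000 genomes over 2026-2030
print(phase2_rate)                      # -> 37500.0 genomes per year
print(phase2_rate / current_rate)       # -> 12.5 (required speedup)

phase2_cost = 150_000 * 6_100           # at $6,100 per genome
phase3_cost = 1_650_000 * 1_900         # at $1,900 per genome
print(f"${(phase2_cost + phase3_cost) / 1e9:.2f} billion")  # -> $4.05 billion
```

Phases two and three alone account for about $4.05 billion; the balance of the roughly $4.7 billion total corresponds to the costlier genomes of phase one.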

It will arguably be the most valuable exabyte in all of science. “With this genomic data, we can get to one of the questions that Darwin asked a long time ago, which is, How does a species arise? What is the origin of species? That’s his famous book where he never actually answered the question,” says Mark Blaxter, who leads the Darwin Tree of Life Project at the Wellcome Sanger Institute near Cambridge and who also conceived and started Project Psyche. “We’ll get a much, much better idea about what it is that makes a species and how species are distinct from each other.”

A portion of that knowledge will come from the many moths collected on those summer nights in the Italian Alps. Lepidoptera “go back around 300 million years,” says Charlotte Wright, a co-leader, along with Blaxter, of Project Psyche. Analyzing the genomes of huge numbers of species will help explain why some branches of the Lepidoptera order have evolved far more species than others, she says.

And that kind of knowledge should eventually accumulate into answers to some of biology’s most profound questions about evolution and the mechanisms by which it acts. “The amazing thing is that by doing this for all of the lepidoptera of Europe, we aren’t just learning about individual cases,” says Wright. “We’ve learned across all of it.”


AI-Powered Tools Enhance Global Innovation Strategies

How IP.com supports the future of IP, AI, and global competitiveness

4 min read

This is a sponsored article brought to you by IP.com

Around the world, innovation is no longer just a function of invention. It is a strategic asset, deeply linked to economic resilience—and increasingly reliant on ever-evolving technologies like artificial intelligence (AI). As intellectual property ecosystems transform under this new reality, the need for advanced, efficient, ethical AI-supported innovation infrastructure has never been more urgent.

The U.S. Patent and Trademark Office (USPTO) has emerged as a global leader in this space, signaling its commitment to responsible AI leadership through the AI and Emerging Technologies (ET) Partnership, and most recently, through its landmark guidance clarifying that human contribution is essential for inventorship in AI-assisted creations and its significant investment in deploying AI to improve prior art searches and overall examination quality.

As governments, enterprises, and inventors look to chart a path through this rapidly shifting IP landscape, one innovation intelligence company has been boldly helping lead the way: IP.com.

For over 20 years, IP.com has worked to empower public and private innovation stakeholders alike with an ever-expanding suite of AI-powered tools that enhance ideation, evaluate novelty, clarify patent landscapes, and protect inventive work around the globe.

Today, as the world rethinks how to balance open innovation with strategic security, IP.com’s solutions are not only timely—they are essential.

Aligning a National Vision with Global Innovation

While each country faces its own innovation challenges and opportunities, there is a growing consensus around one idea: the systems that support innovation—including those for AI and intellectual property—must be modern, data-driven, future-ready, and responsible.

That vision was echoed in the U.S. government’s recent Executive Order on artificial intelligence (EO 14179), and the newly published “America’s AI Action Plan” which emphasize the need for safe and trustworthy AI systems to drive economic growth and national security. While the Executive Order speaks primarily to U.S. priorities, it reflects a global movement toward building responsible AI systems that deliver long-term value, foster trust, and safeguard the integrity of innovation.

IP.com’s innovation platforms are deeply aligned with this broader call, providing AI solutions that are secure and responsible from the ground up. From operating in a private environment to being fully ITAR-compliant, IP.com’s approach delivers IP-advancing AI solutions that are safe and accountable so inventors, R&D teams, IP professionals, and federal agencies can work smarter, faster, and more securely.

Shaping the Future of Responsible, Innovation-Supportive AI

IP.com’s dual-engine AI-fueled Innovation Power (IP) Suite® mirrors how high-performing teams think while aligning closely with the strategies and priorities shaping the future of intellectual property. It responsibly integrates AI into intellectual property processes to foster innovation within a secure, inclusive framework free from ideological bias or hallucinations.

The IP Suite is purpose-built to deliver actionable insights grounded in proprietary data. With security integrated at every level—including ITAR compliance and private AI environments—IP.com offers best-in-class capabilities that protect sensitive data while accelerating innovation workflows. Transparent and traceable outputs reinforce trust and accountability across the innovation lifecycle while empowering forward-thinking teams and simplifying complex technology landscapes.

IP.com’s dual-engine AI-fueled Innovation Power Suite is purpose-built to deliver actionable insights grounded in proprietary data.

That means inventors and engineers can use the IP Suite to safely push innovation boundaries at a rapid pace. By integrating ideation, quantitative novelty analysis, prior art analysis, and invention disclosure generation into one simple, intuitive AI workflow, IP decisions can be made better and faster to maximize ROI. As global innovation accelerates, the broader collaboration IP.com enables through its AI-fueled IP Suite is essential to aligning stakeholders around shared priorities and helping to build a resilient, secure, and future-ready IP ecosystem.

A Bold Force in Global IP Advancement

Though best known among insiders in patent law and innovation strategy, IP.com has shaped some of the most important developments in IP modernization over the past two decades. Used by patent and trademark offices around the world, its enterprise-grade, class-leading semantic AI increases examiner efficiency, powers millions of prior art searches, and accelerates the IP and innovation work of inventors, engineers, and IP professionals alike.

Today, IP.com’s tools serve clients ranging from small inventors to multinational R&D operations. And as patent filings increase globally and emerging technologies blur jurisdictional boundaries, its commitment remains the same: helping innovators everywhere create confidently, compete fairly, and protect what matters—their intellectual property.

Addressing IP Theft Head-On

As the threat of IP theft intensifies, organizations across the public and private sectors are reevaluating how and where they deploy AI. Foreign-backed open and consumer-grade AI solutions, while powerful, often operate in opaque environments with unclear data handling practices, raising serious concerns about data leakage. For entities managing high-value innovation or sensitive research, the risk of proprietary data being exposed or exploited through unsecured AI models has become a pressing issue.

IP.com’s Innovation Power (IP) Suite® is purpose-built to meet this challenge. Designed for enterprise and public-sector use, the platform operates entirely within secure, explainable, and ITAR-compliant environments—ensuring that no prompts, queries, or intellectual property are ever shared with external models or third parties. This architecture preserves data sovereignty while also upholding innovation ethics. Furthermore, the IP Suite helps US companies and government agencies protect innovations while countering the risk of IP theft. IP.com’s secure and ITAR-compliant solutions are uniquely positioned to help enable rapid, secure AI adoption in sensitive environments.

As lines between economic competition and cyber-espionage continue to blur, IP.com stands apart from competitors built on open-source models vulnerable to exploitation. IP.com offers a proven, trustworthy path forward for organizations seeking to innovate responsibly while safeguarding their most valuable ideas.

Beyond IP Security: The IEEE Content Advantage

IP.com further enhances innovation processes by offering engineers direct access to fully searchable IEEE content—one of the most trusted and timely sources of technical knowledge in the world. Whether evaluating the novelty of a new design or researching prior art, engineers benefit from the ability to explore a vast collection of peer-reviewed journals, conference proceedings, and technical standards—all integrated within IP.com’s AI-powered IP Suite.

By embedding IEEE content directly into the research workflow, IP.com empowers engineering teams to make more informed technical and strategic decisions. During concept validation or patentability assessments, having authoritative, high-quality IEEE literature at their fingertips helps engineers validate ideas, identify gaps in the landscape, and avoid costly duplication of effort. Combined with IP.com’s advanced analytics and private, secure AI tools, the integration of IEEE content ensures that engineers not only innovate efficiently but do so with the clarity and depth of insight required in today’s fast-moving R&D environments.

Few platforms offer this kind of integration. Fewer still deliver it with the semantic precision and ease of use that IP.com provides.

Final Thoughts: A Shared Responsibility

Innovation is borderless. Its challenges—be they technical, strategic, or ethical—are shared across geographies. And so must be its solutions.

IP.com is committed to supporting innovation ecosystems worldwide with tools that uphold the values of fairness, security, and excellence. Whether advancing a single patent application or shaping and managing an entire IP portfolio, our mission remains clear: to help innovators move forward—smarter, faster, and together.

Discover how IP.com supports innovation ecosystems worldwide at www.ip.com/AI


Array-Scale MUT Simulations Powered by the Cloud

How full array-scale PMUT and CMUT simulations capture system-level effects such as beam patterns and cross-talk.

1 min read

Designing and optimizing ultrasound transducers—whether PMUTs or CMUTs—requires accuracy at scale. Yet traditional simulation approaches are often constrained to individual cells or limited structures, leaving important array-level effects poorly understood until expensive and time-consuming testing begins. This gap can lead to longer development cycles and a higher risk of failed devices.

In this webinar, we will introduce an improved approach: full array-scale MUT simulations with fully coupled multiphysics. By leveraging Quanscient’s cloud-native platform, engineers can model entire transducer arrays with all relevant physical interactions (electrical, mechanical, acoustic, and more), capturing system-level behaviors such as beam patterns and cross-talk that single-cell simulations miss.

Cloud scalability also enables extensive design exploration. Through parallelization, users can run Monte Carlo analyses, parameter sweeps, and large-scale models in a fraction of the time, enabling rapid optimization and higher throughput in the design process. This not only accelerates R&D but also ensures more reliable designs before fabrication.

The session will feature real-world case examples with detailed insights into the methodology and key metrics. Attendees will gain a practical understanding of how array-scale simulation can greatly improve MUT design workflows, reducing reliance on costly prototypes, minimizing risk, and delivering better device performance.

Join us to learn how array-scale MUT simulations in the cloud can improve MUT design accuracy, efficiency, and reliability.

Register now for this free webinar!


How Wi-Fi Signals Can Help Track Your Heartbeat

The advance opens up new contactless heart-monitoring possibilities

3 min read

University of California, Santa Cruz researchers have developed Pulse-Fi, a prototype system to remotely measure people's pulse rates using ambient Wi-Fi signals.

Erika Cardema/UC Santa Cruz

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Wi-Fi signals today primarily transmit data. But these signals can also be used for other innovative purposes. For instance, one California-based team has proposed using ambient Wi-Fi signals to monitor a person’s heart rate.

The new approach, called Pulse-Fi, offers advantages over existing heart-rate-monitoring methods. It’s low-cost and easily deployable, and it sidesteps the need for people to strap a device to their body.

Katia Obraczka, a professor at the University of California, Santa Cruz, led the development of Pulse-Fi. She notes that continuous tracking of vital signs, including heart rate, can help flag health concerns such as stress, dehydration, cardiac disease, and other illnesses. “But using wearables to monitor vitals can be uncomfortable, have weak adherence, and have limited accessibility due to cost,” she says.

Camera-based methods are one option for remote, contactless tracking of a person’s heart rate without a wearable device. However, these approaches may be compromised in poor lighting conditions and may also raise privacy concerns.

In the search for a better option, Obraczka, along with Ph.D. student Nayan Sanjay Bhatia and high-school intern Pranay Kocheta working in her lab, sought to create Pulse-Fi. “Pulse-Fi uses ordinary Wi-Fi signals to monitor your heartbeat without touching you. It captures tiny changes in the Wi-Fi signal waves caused by heartbeats,” says Obraczka.

How Can Wi-Fi Signals Measure Someone’s Pulse?

Specifically, the team designed Pulse-Fi to filter out background noise and detect the changes in signal amplitude brought about by heartbeats. They developed an AI model—capable of running on a simple computing device, such as a Raspberry Pi—which then reads the filtered signals and estimates heart rate in real time.
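That pipeline can be illustrated with a simplified, NumPy-only sketch. This is not the team's actual system: the 50-hertz sample rate, the 0.8-to-3.0-hertz heart-rate band, and the spectral-peak method are all assumptions standing in for Pulse-Fi's trained AI model. But it shows how restricting attention to the plausible heart-rate band filters out background noise and exposes the pulse:

```python
import numpy as np

def estimate_bpm(amplitude: np.ndarray, fs: float = 50.0) -> float:
    """Estimate heart rate from a Wi-Fi amplitude series.

    Restricting the spectrum to 0.8-3.0 Hz (48-180 bpm) acts as the
    noise-filtering step; the dominant spectral peak gives the pulse.
    """
    spectrum = np.abs(np.fft.rfft(amplitude - amplitude.mean()))
    freqs = np.fft.rfftfreq(amplitude.size, d=1.0 / fs)
    band = (freqs >= 0.8) & (freqs <= 3.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic 60-second trace: a faint 1.2 Hz "heartbeat" ripple (72 bpm)
# buried in much stronger random noise.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 50.0)
trace = 0.1 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * rng.standard_normal(t.size)
print(round(estimate_bpm(trace)))  # -> 72
```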

The team tested their approach in two different experiments, which are described in a study published in August at the 2025 International Conference on Distributed Computing in Smart Systems and the Internet of Things.

First, the researchers had seven volunteers sit in a chair at distances of 1, 2, and 3 meters from two ESP32 microcontrollers that used Pulse-Fi to estimate the volunteers’ heart rates, comparing these data to heart-rate measurements taken by a pulse oximeter. In the second experiment, Pulse-Fi was used on Raspberry Pi devices to monitor the heart rates of more than 100 participants in different positions, including walking, running in place, sitting down, and standing up.

The results show that the system performs on par with other reference sensors, and Pulse-Fi’s less-than-1.5-beats-per-minute error rate compares favorably to other vital-sign-monitoring technologies. Pulse-Fi also maintained its accuracy regardless of the person’s posture (for example, sitting or walking) or distance from the recording device (up to 10 feet away). Based on these results, Obraczka says the team plans to establish a company to commercialize the technology.

Prof. Katia Obraczka and her Ph.D. student Nayan Sanjay Bhatia—both of the University of California, Santa Cruz—discuss research on their Pulse-Fi technology, which remotely measures people’s pulse rates using Wi-Fi signals. Erika Cardema/UC Santa Cruz

Obraczka adds that Pulse-Fi can work in new environments that its underlying AI model hadn’t trained for. “The model generalized well in [a new] setting, showing it’s not just memorizing, but actually learning patterns that transfer to new situations,” she says.

Obraczka also notes that the devices Pulse-Fi runs on are affordable: ESP32 chips cost about US $5 to $10, and Raspberry Pis about $30.

As yet, the researchers have tested Pulse-Fi on only a single user in the room at a time. The team is now beginning to pilot the approach on multiple users simultaneously. “In addition to working on multiuser environments, we are also exploring other wellness and health care applications for Pulse-Fi,” Obraczka says, citing sleep apnea and breathing rates as examples.


Hardware Access Could Fuel African Innovation

Access to hardware could transform engineering education in Africa

6 min read
A Black man sitting on a table surrounded by 3D printers and cables.

Engineer Bainomugisha is a computer science professor at Makerere University, in Kampala, Uganda, where his team developed the AirQo air-quality monitor.

Andrew Esiebo

My name is Engineer Bainomugisha. Yes, Engineer is my first name and also my career. My parents named me Engineer, and they recognized engineering traits in me from childhood, such as perseverance, resilience, and wanting to understand how things work.

I grew up and spent my early years in a rural part of Uganda, more than 300 kilometers outside of Kampala, the capital city. As a young boy, I was always tinkering and hustling: I harvested old radio batteries to power lighting, created household utensils from wood, and herded animals and sold items to help the village make money.


Entering a New Era of Modeling and Simulation

Companies using simulation have a lot to gain, but software skills are a limiting factor. Apps open the playing field.

6 min read
Laptop displaying IGBT thermal analysis software on a circuit board background.
COMSOL

This is a sponsored article brought to you by COMSOL.

Computer modeling and simulation has been used in engineering for many decades. At this point, anyone working in R&D is likely to have either directly used simulation software or indirectly used the results generated by someone else’s model. But in business and in life, “the best laid plans of mice and men can still go wrong.” A model is only as useful as it is realistic, and sometimes the spec changes at a pace that is difficult to keep up with or is not fully known until later in the development process.


Revolutionizing Software Supply Chain Security: A Holistic Approach

New tools and strategies to close critical security gaps in software supply chains

1 min read

Despite significant investments, software supply chains remain vulnerable, evidenced by breaches affecting major enterprises. The Security Systems Research Center (SSRC) at the Technology Innovation Institute (TII) is pioneering new tools and frameworks to secure the entire software supply chain. From Software Bill of Materials (SBOM) generation and vulnerability management automation to Zero Trust development pipelines, SSRC's innovations aim to protect enterprises and governments against escalating cyber threats. This white paper outlines SSRC's comprehensive approach to bolstering software supply-chain security through key advancements like the Ghaf platform and an automated secure build pipeline.

Machine Learning Tests Keep Getting Bigger

And Nvidia keeps beating the competition on them

4 min read
Close-up of two racks of gold and black computers racks with black and gold wiring connecting the computers.

Nvidia topped MLPerf’s new reasoning benchmark with its new Blackwell Ultra GPU, packaged in a GB300 rack-scale design.

Nvidia

The machine learning field is moving fast, and the yardsticks used to measure its progress are having to race to keep up. A case in point: MLPerf, the biannual machine learning competition sometimes termed “the Olympics of AI,” has introduced three new benchmark tests, reflecting new directions in the field.

“Lately, it has been very difficult trying to follow what happens in the field,” says Miro Hodak, an Advanced Micro Devices engineer and MLPerf Inference working-group cochair. “We see that the models are becoming progressively larger, and in the last two rounds we have introduced the largest models we’ve ever had.”


AMD Takes Holistic Approach to AI Coding Copilots

The chip maker is using AI throughout the software development lifecycle

10 min read
Human hand next to robotic circuit hand on green grid background.
Beyond Code Autocomplete
Nick Little

Coding assistants like GitHub Copilot and Codeium are already changing software engineering. Based on existing code and an engineer’s prompts, these assistants can suggest new lines or whole chunks of code, serving as a kind of advanced autocomplete.

At first glance, the results are fascinating. Coding assistants are already changing the work of some programmers and transforming how coding is taught. But a question remains: Is this kind of generative AI just a glorified help tool, or can it actually bring substantial change to a developer’s workflow?


Quantum Leap: Sydney’s Leading Role in the Next Tech Wave

As the country’s leading innovation hub, Sydney is rapidly emerging as a global leader in quantum technology

4 min read
A group of people in a research lab stand around a quantum device consisting of metal chambers, pipes, and wires.

One of several leading quantum startups in Sydney, Silicon Quantum Computing was founded by Michelle Simmons [front, left], a professor of physics at the University of New South Wales (UNSW) and Director of the Centre for Quantum Computation and Communication Technology in Australia.

BESydney

This is a sponsored article brought to you by BESydney.

Australia plays a crucial role in global scientific endeavours, and its contributions are recognized and valued worldwide: despite comprising only 0.3 percent of the world’s population, it has produced more than 4 percent of the world’s published research.


Virtual Mobile Infrastructure: The Next Generation of Mobile Security

How VMI protects government and enterprise mobile phones from evolving hacking threats

1 min read

Despite advancements in mobile security, hacking incidents targeting government and enterprise mobile phones continue to rise. Traditional mobile security tools fall short, as advanced spyware and vulnerabilities persist. Virtual Mobile Infrastructure (VMI) presents an innovative solution by isolating critical apps and sensitive data in the cloud, offering enhanced protection through a virtualized interface. The Secure Software Research Center's new VMI architecture, built on Zero Trust principles, aims to fill gaps left by existing solutions, delivering the highest level of security for BYOD mobile devices.

Can Hardware Security Protect Web3?

A cryptography expert on how Web3 started, and how it’s going

6 min read
Man in black t-shirt looks at laptop screen; behind him is a many-node network diagram in shades of blue.

Cryptography is at the root of whatever remains of Web3—and hardware cryptography is at the root of that.

Source Image: Cubist

The term Web3 was originally coined by Ethereum cofounder Gavin Wood to describe a secure, decentralized, peer-to-peer version of the Internet. The idea was to build an Internet based on blockchain technology and a peer-to-peer network, without the need for large data centers or third-party providers. These days, however, blockchain is most famous as the tool enabling cryptocurrencies. Most recently, the Trump administration has taken a pro-cryptocurrency stance, boosting blockchain’s popularity and media prominence.

Cryptography is central to the functioning of blockchains, whether for a decentralized Web or for cryptocurrencies. Every time a cryptocurrency transaction is initiated, all parties involved in the transaction need to securely prove that they agree to the transfer. This is done via a digital signature: a cryptographic protocol that generates a secret, private key that is unique to each user and a public key that the user shares. Then, the private key is used to generate a unique signature for each transaction. The public key can be used to verify that, indeed, the signature was created by the holder of the private key. In this way, Web3 in every incarnation relies heavily on cryptography.
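The private-key/public-key mechanics described above can be illustrated with a deliberately tiny textbook RSA signature scheme. The primes and exponents below are toy values chosen for readability; real blockchains use vetted cryptographic libraries, different signature algorithms, and keys far too large to factor.

```python
import hashlib

# Toy RSA parameters -- illustrative only; real keys use enormous primes.
p, q = 61, 53
n = p * q                  # public modulus (part of the public key)
phi = (p - 1) * (q - 1)
e = 17                     # public exponent (shared openly)
d = pow(e, -1, phi)        # private exponent (kept secret by the signer)

def sign(message: bytes) -> int:
    """Hash the message, then transform the digest with the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding the public key (n, e) can check the signature."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

sig = sign(b"transfer 1 coin to Bob")
print(verify(b"transfer 1 coin to Bob", sig))   # True: signature matches
print(verify(b"transfer 9 coins to Bob", sig))  # a tampered message fails
```

Because only the holder of `d` can produce a signature that `verify` accepts, the network can confirm who authorized a transaction without anyone ever revealing their private key.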

