One of the main challenges in curing cancer is that, unlike foreign invaders, tumor cells are part of the body and so are able to hide in plain sight. Now researchers have found a way to turn white blood cells into precision-guided missiles that can sniff out these wolves in sheep’s clothing.
One of the biggest breakthroughs in treating cancer in recent years has been the emergence of CAR T-cell therapies, which recruit the body’s immune system to fight tumors rather than relying on radiotherapy or powerful chemotherapy drugs that can have severe side effects.
The approach relies on T-cells, the hunter-killer white blood cells that seek out and destroy pathogens. Therapies involve drawing blood from the patient, separating their T-cells, and then genetically engineering them to produce “chimeric antigen receptors” (CARs) that target specific proteins called antigens on the surface of cancer cells. They are then re-administered to the patient to track down and destroy cancer cells.
The only problem is that very few cancers have unique antigens. Unlike the pathogens T-cells evolved to hunt, tumor cells are not that dissimilar to the body’s other cells and often share many antigens. That means there’s a risk of T-cells targeting the wrong cells and causing serious damage to healthy tissue. As a result, the only therapies approved by the FDA so far focus on blood cancers that affect cells with idiosyncratic antigens.
Now though, researchers at the University of Washington have found a way to help T-cells target a far broader range of cancers. They’ve developed a system of proteins that can carry out logic operations just like a computer, which helps them target specific combinations of antigens that are unique to certain cancers.
“T cells are extremely efficient killers, so the fact that we can limit their activity on cells with the wrong combination of antigens yet still rapidly eliminate cells with the correct combination is game-changing,” said Alexander Salter, one of the lead authors of the study published in Science.
Their technique relies on a series of synthetic proteins that can be customized to create a variety of switches. These can be combined to carry out the AND, OR, and NOT operations at the heart of digital computing, which makes it possible to create instructions that focus on unique combinations of antigens such as “target antigen 1 AND antigen 2 but NOT antigen 3.”
When the correct collection of antigens is present, the proteins combine to create a kind of molecular beacon that guides CAR T-cells to the tumor cell. To demonstrate the effectiveness of the approach, the researchers showed how it helped CAR T-cells pick out and destroy specific tumor cells in a mixture of several different cell types.
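The targeting rule itself is just Boolean logic. Here's a minimal sketch of the "antigen 1 AND antigen 2 but NOT antigen 3" instruction described above, expressed in code; the antigen names and cell profiles are purely illustrative, not taken from the study.

```python
def should_target(antigens: set) -> bool:
    """Kill rule: antigen 1 AND antigen 2 but NOT antigen 3."""
    return ("antigen1" in antigens
            and "antigen2" in antigens
            and "antigen3" not in antigens)

# Hypothetical surface-antigen profiles for a few cell types.
cells = {
    "tumor":     {"antigen1", "antigen2"},                # matches the rule
    "healthy_a": {"antigen1", "antigen2", "antigen3"},    # blocked by the NOT
    "healthy_b": {"antigen1"},                            # fails the AND
}

targeted = [name for name, markers in cells.items() if should_target(markers)]
# only the tumor cell is flagged for destruction
```

The point of the protein system is that this decision happens chemically on the cell surface, rather than in software, but the logic is the same.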
Most other approaches for helping target T-cells are either only able to do basic AND operations to combine two antigens, or rely on engineering the targeting into the T-cells themselves, which is far more complicated. There are still significant barriers to overcome, though.
For a start, the bespoke nature of CAR T-cell therapy means it can be extremely expensive—as high as $1.5 million—so access to this technology, should it make it to the clinic, will be limited. So far the researchers have only studied the proteins’ behavior in vitro, so it’s unclear how the body’s immune system would respond to them if they were injected into a human.
There are also other barriers to treating solid tumors using T-cells beyond simply targeting the correct cells. T-cells struggle to get inside large masses of cancer cells, and even if they do, these tumors often produce proteins that inhibit the effectiveness of the T-cells.
This new protein logic system is still a major breakthrough in the fight against cancer, though. And the researchers point out the technique could be used to target all kinds of different biomedical processes, including gene therapies where you need to deliver DNA to a specific kind of cell. The potential applications of this new missile guidance system for cells are only just starting to be explored.
Image Credit: National Institutes of Health/Alex Ritter, Jennifer Lippincott Schwartz, Gillian Griffiths
If you weren’t already convinced the digital world is taking over, you probably are now.
To keep the economy on life support as people stay home to stem the viral tide, we’ve been forced to digitize interactions at scale (for better and worse). Work, school, events, shopping, food, politics. The companies at the center of the digital universe are now powerhouses of the modern era—worth trillions and nearly impossible to avoid in daily life.
Six decades ago, this world didn’t exist.
A humble microchip in the early 1960s would have boasted a handful of transistors. Now, your laptop or smartphone runs on a chip with billions of transistors. As first described by Moore’s Law, this is possible because the number of transistors on a chip doubled with extreme predictability every two years for decades.
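That doubling compounds dramatically. A quick back-of-the-envelope calculation (the starting count of four transistors is illustrative, standing in for that early-1960s "handful") shows how two-year doublings turn a handful into billions:

```python
# Moore's Law as compound doubling: transistor count doubles every two years.
transistors = 4   # illustrative "handful" on an early-1960s chip
year = 1965

while year < 2020:
    transistors *= 2   # one doubling...
    year += 2          # ...every two years

# After roughly 28 doublings, four transistors become over a billion.
```

Twenty-eight doublings multiply the count by 2^28, or about 268 million—which is how a handful of transistors becomes the billions on a modern chip.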
But now progress is faltering as the size of transistors approaches physical limits, and the money and time it takes to squeeze a few more onto a chip are growing. There’ve been many predictions that Moore’s Law is, finally, ending. But, perhaps also predictably, the company whose founder coined Moore’s Law begs to differ.
In a keynote presentation at this year’s Hot Chips conference, Intel’s chief architect, Raja Koduri, laid out a roadmap to increase transistor density—that is, the number of transistors you can fit on a chip—by a factor of 50.
“We firmly believe there is a lot more transistor density to come,” Koduri said. “The vision will play out over time—maybe a decade or more—but it will play out.”
Why the optimism?
Calling the end of Moore’s Law is a bit of a tradition. As Peter Lee, vice president at Microsoft Research, quipped to The Economist a few years ago, “The number of people predicting the death of Moore’s Law doubles every two years.” To date, prophets of doom have been premature, and though the pace is slowing, the industry continues to dodge death with creative engineering.
Koduri believes the trend will continue this decade and outlined the upcoming chip innovations Intel thinks can drive more gains in computing power.
First, engineers can further shrink today’s transistors. Fin field-effect transistors (or FinFETs) first hit the scene in the 2010s and have since pushed chip features past the 14- and 10-nanometer nodes (as such size checkpoints are called). Koduri said FinFET will again triple chip density before it’s exhausted.
FinFET will hand the torch off to nanowire transistors (also known as gate-all-around transistors).
Here’s how they’ll work. A transistor is made up of three basic parts: the source, where current enters; the channel, where current selectively flows; and the drain, where it exits. The gate sits over the channel and acts like a light switch, controlling how much current flows through. A transistor is “on” when the gate allows current to flow, and “off” when no current flows. The smaller transistors get, the harder it is to control that current.
FinFET maintained fine control of current by surrounding the channel with a gate on three sides. Nanowire designs kick that up a notch by surrounding the channel with a gate on four sides (hence, gate-all-around). They’ve been in the works for years and are expected around 2025. Koduri said first-generation nanowire transistors will be followed by stacked nanowire transistors, and together, they’ll quadruple transistor density.
Growing transistor density won’t only be about shrinking transistors, but also going 3D.
This is akin to how skyscrapers increase a city’s population density by adding more usable space on the same patch of land. Along those lines, Intel recently launched its Foveros chip design. Instead of laying a chip’s various “neighborhoods” next to each other in a 2D silicon sprawl, they’ve stacked them on top of each other like a layer cake. Chip stacking isn’t entirely new, but it’s advancing and being applied to general purpose CPUs, like the chips in your phone and laptop.
Koduri said 3D chip stacking will quadruple transistor density.
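Multiply those individual gains together and you arrive at roughly the 50x figure in Koduri's roadmap. The arithmetic is simple:

```python
# The three density multipliers from Intel's roadmap, as reported above.
finfet_gain   = 3   # further FinFET shrinks triple density
nanowire_gain = 4   # nanowire + stacked nanowire transistors quadruple it
stacking_gain = 4   # 3D chip stacking quadruples it again

total_gain = finfet_gain * nanowire_gain * stacking_gain
# 3 x 4 x 4 = 48 -- roughly the 50x increase Koduri described
```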
The technologies Koduri outlines are an evolution of the same general technology in use today. That is, we don’t need quantum computing or nanotube transistors to augment or replace silicon chips yet. Rather, as it’s done many times over the years, the chip industry will get creative with the design of its core product to realize gains for another decade.
Last year, veteran chip engineer Jim Keller, who at the time was Intel’s head of silicon engineering but has since left the company, told MIT Technology Review there are over 100 variables driving Moore’s Law (including 3D architectures and new transistor designs). From the standpoint of pure performance, it’s also about how efficiently software uses all those transistors. Keller suggested that with some clever software tweaks “we could get chips that are a hundred times faster in 10 years.”
But whether Intel’s vision pans out as planned is far from certain.
Intel’s faced challenges recently, taking five years instead of two to move its chips from 14 nanometers to 10 nanometers. After a delay of six months for its 7-nanometer chips, it’s now a year behind schedule and lagging behind other makers who already offer 7-nanometer chips. This is a key point: yes, chipmakers continue making progress, but it’s getting harder and more expensive, and timelines are stretching.
The question isn’t if Intel and competitors can cram more transistors onto a chip—which Intel rival TSMC agrees is clearly possible—it’s how long it will take and at what cost.
That said, demand for more computing power isn’t going anywhere.
Amazon, Microsoft, Alphabet, Apple, and Facebook now make up a whopping 20 percent of the stock market’s total value. By that metric, tech is the most dominant industry in at least 70 years. And new technologies—from artificial intelligence and virtual reality to a proliferation of Internet of Things devices and self-driving cars—will demand better chips.
There’s ample motivation to push computing to its bitter limits and beyond. As is often said, Moore’s Law is a self-fulfilling prophecy, and likely whatever comes after it will be too.
Image credit: Laura Ockel / Unsplash
IBM Doubles Its Quantum Computer Performance
Stephen Shankland | CNET
“There’s now a race afoot to make the fastest quantum computer. What makes the quantum computing competition different from most in the industry is that rivals are taking wildly different approaches. It’s like a race pitting a horse against a car against an airplane against a bicycle.”
750 Million Genetically Engineered Mosquitos Approved for Release in Florida Keys
Sandee LaMotte | CNN
“…the pilot project is designed to test if a genetically modified mosquito is a viable alternative to spraying insecticides to control the Aedes aegypti. It’s a species of mosquito that carries several deadly diseases, such as Zika, dengue, chikungunya, and yellow fever.”
A Rocket Scientist’s Love Algorithm Adds Up During Covid-19
Stephen Marche | Wired
“Online dating is way up, with more than half of users saying they have been on their dating apps more during lockdown than before. …Just as local businesses had to rush onto delivery platforms, and offices had to figure out Zoom meeting schedules, so the hard realities of the disease have pushed love in the direction it was already going: fully online.”
How a Designer Used AI and Photoshop to Bring Ancient Roman Emperors Back to Life
James Vincent | The Verge
“Machine learning is a fantastic tool for renovating old photos and videos. So much so that it can even bring ancient statues to life, transforming the chipped stone busts of long-dead Roman emperors into photorealistic faces you could imagine walking past on the street.”
What If We Could Live for a Million Years?
Avi Loeb | Scientific American
“With advances in bioscience and technology, one can imagine a post-Covid-19 future when most diseases are cured and our life span will increase substantially. If that happens, how would our goals change, and how would this shape our lives?”
A Radical New Model of the Brain Illuminates Its Wiring
Grace Huckins | Wired
“‘The brain literally is a network,’ agrees Olaf Sporns, a professor of psychological and brain sciences at Indiana University. ‘It’s not a metaphor. I’m not comparing apples and oranges. I think this is literally what it is.’ And if network neuroscience can produce a clearer, more accurate picture of the way that the brain truly works, it may help us answer questions about cognition and health that have bedeviled scientists since Broca’s time.”
How Life Could Continue to Evolve
Caleb Scharf | Nautilus
“…the ultimate currency of life in the universe may be life itself: The marvelous genetic surprises that biological and technological Darwinian experimentation can come up with given enough diversity of circumstances and time. Perhaps, in the end, our galaxy, and even our universe, is simply the test tube for a vast chemical computation exploring a mathematical terrain of possibilities that stretches on to infinity.”
British Grading Debacle Shows Pitfalls of Automating Government
Adam Satariano | The New York Times
“Those who have called for more scrutiny of the British government’s use of technology said the testing scandal was a turning point in the debate, a vivid and easy-to-understand example of how software can affect lives.”
Image credit: ESA/Hubble & NASA, J. Lee and the PHANGS-HST Team; Acknowledgment: Judy Schmidt (Geckzilla)
Though 5G—a next-generation speed upgrade to wireless networks—is scarcely up and running (and still nonexistent in many places), researchers are already working on what comes next. It lacks an official name, but they’re calling it 6G for the sake of simplicity (and hey, it’s tradition). 6G promises to be up to 100 times faster than 5G—fast enough to download 142 hours of Netflix in a second—but researchers are still trying to figure out exactly how to make such ultra-speedy connections happen.
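That Netflix claim passes a rough sanity check. Assuming roughly one gigabyte per hour of video (an assumption on our part, in the ballpark of standard-definition streaming, not a figure from the announcement), the arithmetic lands in the right range:

```python
# Rough sanity check of "142 hours of Netflix in a second."
rate_bits_per_s = 100 * 10e9          # 100x faster than 5G's ~10 Gbps ceiling
gb_per_second = rate_bits_per_s / 8 / 1e9   # -> 125 GB downloaded per second

gb_per_hour_of_video = 1.0            # assumed video bitrate, ~SD quality
hours_per_second = gb_per_second / gb_per_hour_of_video
# ~125 hours of video per second -- the same order as the quoted 142
```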
A new chip, described in a paper in Nature Photonics by a team from Osaka University and Nanyang Technological University in Singapore, may give us a glimpse of our 6G future. The team was able to transmit data at a rate of 11 gigabits per second, topping 5G’s theoretical maximum speed of 10 gigabits per second and fast enough to stream 4K high-def video in real time. They believe the technology has room to grow, and with more development, might hit those blistering 6G speeds.
But first, some details about 5G and its predecessors so we can differentiate them from 6G.
Electromagnetic waves are characterized by a wavelength and a frequency; the wavelength is the distance a cycle of the wave covers (peak to peak or trough to trough, for example), and the frequency is the number of waves that pass a given point in one second. Cellphones use miniature radios to pick up electromagnetic signals and convert those signals into the sights and sounds on your phone.
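Wavelength and frequency are linked by the speed of light: wavelength equals the speed of light divided by frequency. A few representative values put the bands discussed in this article side by side:

```python
C = 3.0e8  # speed of light, meters per second

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters = speed of light / frequency."""
    return C / freq_hz

wavelength_m(1e9)    # ~0.3 m   -- the ~1 GHz low/mid-band spectrum
wavelength_m(300e9)  # ~1 mm    -- top of the millimeter-wave band
wavelength_m(1e12)   # ~0.3 mm  -- 1 terahertz, the regime 6G research targets
```

This is also why "millimeter wave" is a literal description: at hundreds of gigahertz, each wave cycle spans only millimeters.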
4G wireless networks run on radio waves in the low- and mid-band spectrum, defined as frequencies a little below (low-band) and a little above (mid-band) one gigahertz (or one billion cycles per second). 5G kicked that up several notches by adding much higher-frequency millimeter waves of up to 300 gigahertz, or 300 billion cycles per second. Data transmitted at those higher frequencies tends to be information-dense—like video—because higher frequencies can carry far more data per second.
The 6G chip kicks 5G up several more notches. It can transmit waves at more than three times the frequency of 5G: one terahertz, or a trillion cycles per second. The team says this yields a data rate of 11 gigabits per second. While that’s faster than the fastest 5G will get, it’s only the beginning for 6G. One wireless communications expert even estimates 6G networks could handle rates up to 8,000 gigabits per second; they’ll also have much lower latency and higher bandwidth than 5G.
Terahertz waves fall between infrared waves and microwaves on the electromagnetic spectrum. Generating and transmitting them is difficult and expensive, requiring special lasers, and even then the frequency range is limited. The team used a new material to transmit terahertz waves, called photonic topological insulators (PTIs). PTIs can conduct light waves on their surface and edges rather than having them run through the material, and allow light to be redirected around corners without disturbing its flow.
The chip is made completely of silicon and has rows of triangular holes. The team’s research showed the chip was able to transmit terahertz waves error-free.
Nanyang Technological University associate professor Ranjan Singh, who led the project, said, “Terahertz technology […] can potentially boost intra-chip and inter-chip communication to support artificial intelligence and cloud-based technologies, such as interconnected self-driving cars, which will need to transmit data quickly to other nearby cars and infrastructure to navigate better and also to avoid accidents.”
Besides being used for AI and self-driving cars (and, of course, downloading hundreds of hours of video in seconds), 6G would also make a big difference for data centers, IoT devices, and long-range communications, among other applications.
Given that 5G networks are still in the process of being set up, though, 6G won’t be coming on the scene anytime soon. A recent whitepaper on 6G from Japanese company NTT DoCoMo estimates we’ll see it in 2030, pointing out that wireless connection technologies have so far arrived in generations spaced about 10 years apart: we got 3G in the early 2000s, 4G in 2010, and 5G in 2020.
In the meantime, as 6G continues to develop, we’re still looking forward to the widespread adoption of 5G.
Image Credit: Hans Braxmeier from Pixabay
If you were to stack up all the electronic waste produced annually around the world, it would weigh as much as all the commercial aircraft ever produced, or 5,000 Eiffel Towers. This is a growing “tsunami,” according to the UN, and it’s fed by all the phones, tablets, and other electronic devices that are thrown away each day.
Of the 44.7 million metric tons of electronic waste (often shortened to “e-waste”) produced around the world in 2017, 90 percent was sent to landfill, incinerated, or illegally traded. Europe and the US accounted for almost half of this; the EU is predicted to produce 12 million tons in 2020 alone. If nothing is done to combat the problem, the world is expected to produce more than 120 million tons annually by 2050.
Rich countries in Europe and North America export much of their e-waste to developing countries in Africa and Asia. A lot of this ends up accumulating in landfills, where toxic metals leach out and enter groundwater and food chains, threatening human health and the environment.
As daunting as this problem seems, we’re working on a solution. Using a process called bioleaching, we’re extracting and recycling these metals from e-waste using non-toxic bacteria.
It might surprise you to learn that those toxic metals are actually very valuable. It’s a bitter irony that the e-waste mountains collecting in the world’s poorest places actually contain a fortune. Precious metals are found in your phone and computer, and each year $21 billion worth of gold and silver are used to manufacture new electronic devices. E-waste is thought to contain seven percent of the world’s gold, and could be used to manufacture new products if it could be recycled safely.
With an estimated worth of $62.5 billion a year, the economic benefits of recycling e-waste are clear. And it would help meet the shortfall for new natural resources that are needed to manufacture new products. Some of the elements on a printed circuit board, essentially the brain of a computer, are raw materials whose supply is at risk.
Other elements found in electronics are considered some of the periodic table’s most endangered. There is a serious threat that they will be depleted within the next century. With today’s trends of natural resource use, natural sources of indium will be depleted in about 10 years, platinum in 15 years and silver in 20 years.
But recovering these materials is more difficult than you might imagine.
Pyrometallurgy and hydrometallurgy are the current technologies used for extracting and recycling e-waste metals. They involve high temperatures and toxic chemicals, and so are extremely harmful to the environment. They require lots of energy and produce large volumes of toxic gas too, creating more pollution and leaving a large carbon footprint.
But bioleaching has existed as a solution to these problems as far back as the era of the Roman Empire. The modern mining industry has relied on it for decades, using microbes (mainly bacteria, but also some fungi) to extract metals from ores.
Microorganisms chemically modify the metal, setting it free from the surrounding rock and allowing it to dissolve in a microbial soup, from which the metal can be isolated and purified. Bioleaching requires very little energy and so has a small carbon footprint. No toxic chemicals are used either, making it environmentally friendly and safe.
Despite how useful it is, applying bioleaching to e-waste has mostly been an academic pursuit. But our research group is leading the first industrial effort. In a recent study, we reported how we managed to extract copper from discarded computer circuit boards using this method and recycle it into high-quality foil.
Different metals have different properties, so new methods must be constantly developed. Extracting metals by bioleaching, though pollution-free, is also slower than the traditional methods. Thankfully though, genetic engineering has already shown that we can improve how efficiently these microbes can be used in green recycling.
After our success recycling metals from discarded computers, scientists are trying other types of e-waste, including electric batteries. But developing better recycling techniques is only one piece of the puzzle. For a completely circular economy, recycling should start with manufacturers and producers. Designing devices that are more easily recycled and tackling the throw-away culture that treats the growing problem with indifference are both equally vital in slowing the oncoming tsunami.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Image Credit: INESby from Pixabay
The UN population forecast predicts that by 2050 there will be almost 10 billion people on the planet. They’ll live mostly in cities and have an older median age than the current global population. One looming question is: what will they eat?
The Green Revolution of the 1960s used selective breeding to double crop yields of rice and wheat in some areas of the world, rescuing millions of people from food shortages and even famine. Now, the fast-growing global population combined with the impact of climate change on our ability to produce food—increased droughts and extreme weather events many crops can’t withstand—points to the need for another green revolution.
Luckily there’s already one underway. It’s more decentralized than the last, which makes sense given there are different challenges surfacing in different parts of the world. A Norwegian startup called Desert Control has a running start on solving a problem that’s only likely to get worse with time.
Desertification—human and climate-caused degradation of land in dry areas—is on the rise in multiple parts of the world; more than two billion hectares of land that was once productive has been degraded. That’s an area twice the size of China. Meanwhile, the UN estimates that by 2030 we’ll need an additional 300 million hectares of land for food production.
Desert Control’s technology not only keeps land from degrading further, it actually transforms arid, poor-quality soil into nutrient-rich, food-growing soil.
Here’s how it works.
Sandy soil doesn’t retain water and nutrients; they run right through it and end up in the groundwater below. Desert Control developed a substance it calls Liquid Nanoclay (LNC), which coats sand particles with a layer of clay 1.5 nanometers thick. The coating lets moisture cling to the sand and be absorbed rather than draining away, so water and nutrients stick around too, creating ideal conditions for plants to grow.
Picture the technology as a sort of giant sponge inserted just below ground level. LNC is sprayed onto land using traditional irrigation systems (like sprinklers), saturating the soil down to the root level of the plants that will be grown there. The “sponge” holds moisture within itself—as sponges do—keeping that moisture from filtering down deeper where it would no longer reach plants’ roots, and enhancing the effects of fertilizer. There are no chemicals involved, just clay and water.
Perhaps most amazingly, it only takes seven hours to transform a piece of land from arid to arable with LNC application.
According to Desert Control’s website, a field test near Abu Dhabi yielded cauliflowers and carrots that were 108 percent bigger than those in the control area, and field tests in Egypt documented a four-fold increase in the yield of wheat. Most recently, LNC was used to grow watermelon, pearl millet, and zucchini in the desert outside Dubai.
Adding clay to soil isn’t a new idea; farmers have been doing it for centuries. What’s new is breaking the clay down to a nanoparticle level and getting a liquid substance that can be easily sprayed onto land. Farmers traditionally used heavy machinery to mix clay into soil; this way uses a lot less clay (10 times less, by the company’s estimate), and is more effective to boot.
Having plants on a stretch of land brings its temperature down and helps stop soil erosion, meaning Desert Control’s solution, if widely implemented, would truly, well, control the desert.
So why isn’t this herald of the 21st-century green revolution being shipped out and sprayed down as fast as the company can manage?
The answer, as is unfortunately often the answer with new technologies, is cost. Treating arid land with LNC costs $2 to $5 per square meter, far more than many farmers can afford. There are also still some uncertainties around whether the treatment impacts the broader ecosystem in any negative ways.
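To see why that per-square-meter price is prohibitive, scale it up to farm-sized plots. The conversion below is simple arithmetic from the figures quoted above:

```python
# Scaling the quoted LNC treatment cost from square meters to hectares.
SQ_M_PER_HECTARE = 10_000

low_cost_per_hectare = 2 * SQ_M_PER_HECTARE    # $20,000 per hectare
high_cost_per_hectare = 5 * SQ_M_PER_HECTARE   # $50,000 per hectare
```

At $20,000 to $50,000 per hectare, treating even a modest smallholder farm would run into six figures—far out of reach for the low-income farmers who need it most.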
Desert Control is currently working on scaling up its production operations and bringing LNC’s costs down, hoping to ultimately make the product affordable to farmers in low-income countries.
If they succeed, they would surely make late Green Revolution pioneer Norman Borlaug proud.
Image Credit: Desert Control
In college, I volunteered to have a needle jabbed into the fleshy part between my thumb and forefinger in the name of acupuncture. I had bruised the area earlier in a lab experiment. I went in thinking I was completely crazy to try it out.
I left thinking that the treatment probably should’ve hurt a lot more than it actually did—and that it might have actually helped with the pain.
One quirk of studying pharmacy at Peking University’s Medical School is that in senior year you have to familiarize yourself with traditional Chinese medicine (TCM). One prominent part of TCM is acupuncture, the practice of sticking sharp needles into specific sites on the body, called “acupoints,” with claims that it can reduce inflammation. It’s remarkably different from Western medicine, my professors said at the time: these treatments target and stimulate the basic biological processes behind disorders such as pain in order to root it out—that is, resolve why you’re in pain—whereas Western pain medications such as opioids, acetaminophen, or ibuprofen only act as a balm or Band-Aid for pain.
I rolled my eyes and skipped that class. Yet recently, as the opioid epidemic raged across North America and the world, the need for better pain treatment options became more prominent than ever.
One idea is to specifically stimulate nerves that act as highway carriers of pain signals and block those signals like a DDoS attack—using some sort of sharp neural interface. If the brain can’t process pain signals coming in, then you don’t consciously feel any pain.
This month, a team from Harvard Medical School, Baylor College of Medicine, and China united East with West with another look at a revamped form of acupuncture—electroacupuncture, which hits the same acupoints as the practice has for centuries, but with mild electrical pulses.
In mice given life-threatening injections of an immune-stimulating chemical called lipopolysaccharide (LPS), the team found that electroacupuncture could lower the amount of pro-inflammatory chemicals and double the mice’s survival rate—but only if the stimulation came before the injection. If acupuncture was provided after an LPS shock, the mice showed even higher levels of inflammation. The results were published in the journal Neuron.
“We were really surprised to find that the same input has completely opposite outcomes in different disease stages,” said study author Dr. Qiufu Ma at Harvard. “Most Western medicine has been focusing on blocking the neural pathways of pain to relieve the symptoms, but there are so many pain pathways and so many ways to open each of them.”
You might find acupuncture kind of hokey. But the practice of stimulating nerves to control inflammation is well established in modern medicine.
Take vagus nerve stimulation (VNS). The vagus nerve is a long biological highway that runs from the base of your brain through the neck into the chest and abdomen. It’s a powerful one. For the brain, VNS is approved by the US Food and Drug Administration for various conditions, including epilepsy and depression that hasn’t responded to normal pharmaceutical treatments.
For the body, VNS is potentially helpful for inflammatory bowel disease, likely through a pathway that includes certain types of neurons in the spleen—a pathway that biologists dub the “somatosensory autonomic pathway.” These connections form an intricate web of nerves linking the brain and spine to organs, allowing the latter to function independently of conscious thought, hence the term “autonomic.” Think heart rate, digestion, breathing, waste disposal, arousal and—of note—inflammation.
This is where acupuncture comes in. “One core idea is that [acupuncture] stimulation at specific points…can long-distantly modulate internal organs,” the authors explained. The traditional idea is that this is through so-called “channels” that link the stimulated spot to the actual treatment region. Anatomical studies generally haven’t found direct physical structures that support the idea of channels. However, Ma and team said, the vagus nerve might serve a similar function. For example, at least half a dozen studies have found that stimulating the limb area acupoint can suppress systemic inflammation, partially through the vagus nerve.
The question is how.
To kill pain, the first question is what drives inflammation.
The authors first homed in on an acupoint called ST25 near the belly region. Back in the 90s, scientists found that pinching this spot can activate nerves that lead to the spleen, which produces immune cells and is a major regulator of inflammation.
They first gave the mice a 15-minute electroacupuncture session at that spot with 3mA pulses. Afterwards, they injected LPS, the life-threatening toxin that triggers vast immune reactions in mice. LPS is often used to mimic severe bacterial infections in humans and is widely used to develop new antibacterial medications.
Remarkably, mice given the electrical zaps showed lower levels of pro-inflammatory chemicals in their blood—dubbed cytokines—than those without the stimulation. However, these effects were nixed in mice lacking a particular neuronal cell type—NPY cells—that partially forms those phone lines between acupuncture and effect, suggesting the cells are necessary for acupuncture’s anti-inflammatory effects.
But it gets weirder. Jabbing the ST25 point only reduced inflammation when done before the mice got the hyper-inflammatory shot. When applied afterwards, the team found, acupuncture actually boosted inflammation and drastically reduced the mice’s chances of survival.
Digging into the neural pathways behind these differences, the team found that the electro-needle jab on the belly fine-tuned the activity of neurons in pain- and inflammation-relays between the spleen and the spine. After a hefty dose of LPS, however, acupuncture stirred up the “fight or flight” pathway through a different set of biological communication lines, which in turn ramped up inflammation to a dangerous level.
In another set of experiments, the team repeated the electrojabs on the mice’s hind legs, but at lower intensity. Here, the stimulation lowered pro-inflammatory molecules in the blood whether given before or after the LPS assault. This acupoint, dubbed ST36, seemed to trigger a different nerve pathway than the abdominal stimulation—the famous vagus nerve, known for its role in dampening inflammation and pain.
Bottom line: the two acupoints, both established millennia ago through trial and error, regulate two separate nerve pathways recognized by modern neuroscience.
The way they work differs, though: the abdominal ST25 is finicky, in that it only helps as a preventative. Add in existing inflammation, and it makes things worse. The hind-leg ST36, on the other hand, is a champion at quelling inflammation, but only if you zap it at a lighter intensity.
It’s hard to say how these results translate to traditional acupuncture, which lacks the electrical stimulation aspect. In a way, electroacupuncture is a sort of bridge between old and new: it brings acupuncture and its theories, rooted in thousands of years of history, into the future. An electroacupuncture needle is, in one view, a minimally invasive neural interface that only pierces the top of the skin. And yet, guided by ancient acupunctural maps, it manages to interface with our nerves to reduce inflammation and pain—at least for two points, and in mice.
Do I still roll my eyes at acupuncture? To be honest, yeah. But I’m willing to give it another look.
“Our study illustrated that electroacupuncture has neuroanatomic basis, but its efficacy and safety on humans need to be validated in clinical trials,” said Ma. “There’s still many questions unanswered about this medical practice and thus a lot of room to do more research.”
Image Credit: Wikimedia Commons
One of the biggest barriers to the renewable energy revolution is working out how to store power when the sun doesn’t shine and the wind doesn’t blow. Now scientists have shown standard construction bricks can be converted into energy storage units, potentially turning our houses into giant batteries.
While lithium-ion batteries have seen dramatic price drops in recent years, most experts agree they will remain too expensive for grid-scale storage. The relative scarcity of lithium also means they’re unlikely to be able to meet all our energy needs.
That’s driven considerable research into alternative ways of storing excess renewable energy, from much cheaper molten salt batteries to approaches that use spare electricity to compress air or pump water uphill before later releasing them to drive turbines.
But it seems a potential energy storage medium has been sitting under our noses all this time. In a paper in Nature Communications, researchers from Washington University in St. Louis have demonstrated that bricks bought from Home Depot can be treated with a simple chemical procedure to give them battery-like power storing capabilities.
The technique takes advantage of the brick’s porous structure to deposit a layer of a conducting polymer called PEDOT throughout the brick. This converts each brick into a supercapacitor, which is similar to a battery but typically trades faster charging times for lower storage capacity.
The researchers’ process involves first bathing the bricks in a hydrochloric acid vapor, which seeps into the pores and reacts with the iron oxide that gives the bricks their red color. This turns the iron oxide into a reactive form of iron, which then interacts with another gas that is flooded through the brick to create a thin film of PEDOT, an electrically conductive plastic.
This coating is actually a mat of nanofibers with a very large surface area, which increases its energy storage capacity. This PEDOT coating acts as an electrode, and the researchers also added a gel electrolyte to the bricks.
They showed that three small bricks were enough to power a green LED for ten minutes on a single charge. What’s more, the waterproof epoxy the researchers coated the bricks with also prevented water from evaporating out of the gel, which means the bricks can be charged and discharged for 10,000 cycles with only a 10 percent loss in capacity.
The bricks are still a proof of concept rather than a ready-to-go solution to our energy storage needs; their energy density is just one percent that of lithium-ion batteries. In a press release, Julio D’Arcy, who led the study, said 50 bricks hooked up to a solar panel could provide emergency lighting for 5 hours. That’s a long way from storing enough power for our increasingly energy-hungry households.
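Those demo figures allow a rough back-of-envelope estimate of how much energy each brick stores. The sketch below assumes a small green LED draws about 40 mW; that number is my assumption for illustration, not one from the study.

```python
# Back-of-envelope: energy stored per brick, from the three-brick LED demo.
# ASSUMPTION: a small green indicator LED draws ~40 mW (~20 mA at ~2 V);
# the study may have used a different load.

led_power_w = 0.040      # assumed LED power draw, in watts
run_time_h = 10 / 60     # the ten-minute demo, in hours
bricks_in_demo = 3       # three bricks powered the LED

energy_per_brick_wh = led_power_w * run_time_h / bricks_in_demo
total_wh_50_bricks = energy_per_brick_wh * 50

print(f"~{energy_per_brick_wh * 1000:.1f} mWh per brick")
print(f"~{total_wh_50_bricks * 1000:.0f} mWh across 50 bricks")
```

Under that assumption each brick holds only a couple of milliwatt-hours, which is why D’Arcy pitches the 50-brick setup at low-power emergency lighting rather than household loads.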
D’Arcy also conceded to New Scientist that there is some concern that the acid treatment might affect the integrity of the bricks, to the extent that they might not be able to make up the main structural components of a building.
But it’s still early, and the team sees routes to significant improvements in performance. D’Arcy notes that the team is already working on ways to turn their nanofibers into composite materials containing other semiconductors, which they hope will boost capacity by a factor of ten. They’re also working on tweaks to the manufacturing process to boost speeds and bring down costs.
They’re not the only ones trying to boost the functionality of the humble brick. D’Arcy notes that one group has combined bricks with metal oxide nanoparticles to help filter pollution out of the air, and another has created bricks that can conduct electricity by incorporating electrodes made from carbon nanomaterials. Electric carmaker Tesla is also building solar power-generating roof tiles.
While there’s still a long way to go, it seems the houses of the future might be giant batteries that can charge themselves using abundant renewable energy.
Image Credit: D’Arcy Laboratory
When Ralph Fisher, a Texas cattle rancher, set eyes on one of the world’s first cloned calves in August 1999, he didn’t care what the scientists said: He knew it was his old Brahman bull, Chance, born again. About a year earlier, veterinarians at Texas A&M extracted DNA from one of Chance’s moles and used the sample to create a genetic double. Chance didn’t live to meet his second self, but when the calf was born, Fisher christened him Second Chance, convinced he was the same animal.
Scientists cautioned Fisher that clones are more like twins than carbon copies: The two may act or even look different from one another. But as far as Fisher was concerned, Second Chance was Chance. Not only did they look identical from a certain distance, they behaved the same way as well. They ate with the same odd mannerisms; lay in the same spot in the yard. But in 2003, Second Chance attacked Fisher and tried to gore him with his horns. About 18 months later, the bull tossed Fisher into the air like an inconvenience and rammed him into the fence. Despite 80 stitches and a torn scrotum, Fisher resisted the idea that Second Chance was unlike his tame namesake, telling the radio program This American Life, “I forgive him, you know?”
In the two decades since Second Chance marked a genetic engineering milestone, cattle have secured a place on the front lines of biotechnology research. Today, scientists around the world are using cutting-edge technologies, from subcutaneous biosensors to specialized food supplements, in an effort to improve safety and efficiency within the $385 billion global cattle meat industry. Beyond boosting profits, their efforts are driven by an imminent climate crisis, in which cattle play a significant role, and growing concern for livestock welfare among consumers.
Gene editing stands out as the most revolutionary of these technologies. Although gene-edited cattle have yet to be granted approval for human consumption, researchers say tools like Crispr-Cas9 could let them improve on conventional breeding practices and create cows that are healthier, meatier, and less detrimental to the environment. Cows are also being given genes from the human immune system to create antibodies in the fight against Covid-19. (The genes of non-bovine livestock such as pigs and goats, meanwhile, have been hacked to grow transplantable human organs and produce cancer drugs in their milk.)
But some experts worry biotech cattle may never make it out of the barn. For one thing, there’s the optics issue: Gene editing tends to grab headlines for its role in controversial research and biotech blunders. Crispr-Cas9 is often celebrated for its potential to alter the blueprint of life, but that enormous promise can become a liability in the hands of rogue and unscrupulous researchers, tempting regulatory agencies to toughen restrictions on the technology’s use. And it’s unclear how eager the public will be to buy beef from gene-edited animals. So the question isn’t just if the technology will work in developing supercharged cattle, but whether consumers and regulators will support it.
Cattle are catalysts for climate change. Livestock account for an estimated 14.5 percent of greenhouse gas emissions from human activities, of which cattle are responsible for about two thirds, according to the United Nations’ Food and Agriculture Organization (FAO). One simple way to address the issue is to eat less meat. But meat consumption is expected to increase along with global population and average income. A 2012 report by the FAO projected that meat production will increase by 76 percent by 2050, as beef consumption increases by 1.2 percent annually. And the United States is projected to set a record for beef production in 2021, according to the Department of Agriculture.
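Combining the two FAO figures quoted above makes cattle’s share of total human-caused emissions explicit:

```python
# Cattle's share of anthropogenic greenhouse gas emissions,
# computed from the two FAO figures quoted above.
livestock_share = 0.145   # livestock's share of human-caused emissions
cattle_fraction = 2 / 3   # cattle's share within livestock emissions

cattle_share = livestock_share * cattle_fraction
print(f"~{cattle_share:.1%}")   # roughly 9.7% of all anthropogenic emissions
```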
For Alison Van Eenennaam, an animal geneticist at the University of California, Davis, part of the answer is creating more efficient cattle that rely on fewer resources. According to Van Eenennaam, the number of dairy cows in the United States decreased from around 25 million in the 1940s to around 9 million in 2007, while milk production has increased by nearly 60 percent. Van Eenennaam credits this boost in productivity to conventional selective breeding.
“You don’t need to be a rocket scientist or even a mathematician to figure out that the environmental footprint or the greenhouse gases associated with a glass of milk today is about one-third of that associated with a glass of milk in the 1940s,” she says. “Anything you can do to accelerate the rate of conventional breeding is going to reduce the environmental footprint of a glass of milk or a pound of meat.”
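Her back-of-the-envelope claim can be sanity-checked with the figures above. The sketch below crudely assumes herd emissions scale with head count; real per-cow emissions have grown with cow size and feed intake, which is likely why the quoted figure is “about one-third” rather than the smaller number this crude model gives.

```python
# Rough check of the "one-third the footprint" claim, using only the
# figures quoted above. CRUDE ASSUMPTION: total herd emissions scale
# with head count, so footprint per glass ~ (cows) / (total milk output).

cows_1940s = 25e6
cows_2007 = 9e6
milk_growth = 1.60        # milk output up ~60 percent over the same span

relative_footprint = (cows_2007 / cows_1940s) / milk_growth
print(f"{relative_footprint:.3f}")   # ~0.225 of the 1940s footprint per glass
```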
Modern gene-editing tools may fuel that acceleration. By making precise cuts to DNA, geneticists insert or remove naturally occurring genes associated with specific traits. Some experts insist that gene editing has the potential to spark a new food revolution.
Jon Oatley, a reproductive biologist at Washington State University, wants to use Crispr-Cas9 to fine-tune the genetic code of rugged, disease-resistant, and heat-tolerant bulls that have been bred to thrive on the open range. By disabling a gene called NANOS2, he says he aims to “eliminate the capacity for a bull to make his own sperm,” turning the recipient into a surrogate for sperm-producing stem cells from more productive prized stock. These surrogate sires, equipped with sperm from prize bulls, would then be released into range herds that are often genetically isolated and difficult to access, passing the premium genes on to their offspring.
Furthermore, surrogate sires would enable ranchers to introduce desired traits without having to wrangle their herd into one place for artificial insemination, says Oatley. He envisions the gene-edited bulls serving herds in tropical regions like Brazil, the world’s largest beef exporter and home to around 200 million of the approximately 1.5 billion head of cattle on Earth.
Brazil’s herds are dominated by Nelore, a hardy breed that lacks the carcass and meat quality of breeds like Angus but can withstand high heat and humidity. Put an Angus bull on a tropical pasture and “he’s probably going to last maybe a month before he succumbs to the environment,” says Oatley, while a Nelore bull carrying Angus sperm would have no problem with the climate.
The goal, according to Oatley, is to introduce genes from beefier bulls into these less efficient herds, increasing their productivity and decreasing their overall impact on the environment. “We have shrinking resources,” he says, and need new, innovative strategies for making those limited resources last.
Oatley has demonstrated his technique in mice but faces challenges with livestock. For starters, disabling NANOS2 does not definitively prevent the surrogate bull from producing some of its own sperm. And while Oatley has shown he can transplant sperm-producing cells into surrogate livestock, researchers have not yet published evidence showing that the surrogates produce enough quality sperm to support natural fertilization. “How many cells will you need to make this bull actually fertile?” asks Ina Dobrinski, a reproductive biologist at the University of Calgary who helped pioneer germ cell transplantation in large animals.
But Oatley’s greatest challenge may be one shared with others in the bioengineered cattle industry: overcoming regulatory restrictions and societal suspicion. Surrogate sires would be classified as gene-edited animals by the Food and Drug Administration, meaning they’d face a rigorous approval process before their offspring could be sold for human consumption. But Oatley maintains that if his method is successful, the sperm itself would not be gene-edited, nor would the resulting offspring. The only gene-edited specimens would be the surrogate sires, which act like vessels in which the elite sperm travel.
Even so, says Dobrinski, “That’s a very detailed difference and I’m not sure how that will work with regulatory and consumer acceptance.”
In fact, American attitudes towards gene editing have been generally positive when the modification is in the interest of animal welfare. Many dairy farmers prefer hornless cows—horns can inflict damage when wielded by 1,500-pound animals—so they often burn them off in a painful process using corrosive chemicals and scalding irons. In a study published last year in the journal PLOS One, researchers found that “most Americans are willing to consume food products from cows genetically modified to be hornless.”
Still, experts say several high-profile gene-editing failures in livestock and humans in recent years may lead consumers to consider new biotechnologies to be dangerous and unwieldy.
In 2014, a Minnesota startup called Recombinetics, a company with which Van Eenennaam’s lab has collaborated, used the gene-editing tool TALENs, a precursor to Crispr-Cas9, to create a pair of cross-bred Holstein bulls, cutting and altering the bovine DNA so the bulls wouldn’t grow horns. Holstein cattle, which almost always carry horned genes, are highly productive dairy cows, so using conventional breeding to introduce hornless genes from less productive breeds can compromise the Holsteins’ productivity. Gene editing offered a chance to introduce only the genes Recombinetics wanted. Their hope was to use this experiment to prove that milk from the bulls’ female progeny was nutritionally equivalent to milk from non-edited stock. Such results could inform future efforts to make Holsteins hornless but no less productive.
The experiment seemed to work. In 2015, Buri and Spotigy were born. Over the next few years, the breakthrough received widespread media coverage, and when Buri’s hornless descendant graced the cover of Wired magazine in April 2019, it did so as the ostensible face of the livestock industry’s future.
But early last year, a bioinformatician at the FDA ran a test on Buri’s genome and discovered an unexpected sliver of genetic code that didn’t belong. Traces of bacterial DNA called a plasmid, which Recombinetics used to edit the bull’s genome, had stayed behind in the editing process, carrying genes linked to antibiotic resistance in bacteria. After the agency published its findings, the media reaction was swift and fierce: “FDA finds a surprise in gene-edited cattle: antibiotic-resistant, non-bovine DNA,” read one headline. “Part cow, part… bacterium?” read another.
Recombinetics has since insisted that the leftover plasmid DNA was likely harmless and stressed that this sort of genetic slipup is not uncommon.
“Is there any risk with the plasmid? I would say there’s none,” says Tad Sonstegard, president and CEO of Acceligen, a Recombinetics subsidiary. “We eat plasmids all the time, and we’re filled with microorganisms in our body that have plasmids.” In hindsight, Sonstegard says his team’s only mistake was not properly screening for the plasmid to begin with.
While the presence of antibiotic-resistant plasmid genes in beef probably does not pose a direct threat to consumers, according to Jennifer Kuzma, a professor of science and technology policy and co-director of the Genetic Engineering and Society Center at North Carolina State University, it does raise the possible risk of introducing antibiotic-resistant genes into the microflora of people’s digestive systems. Although unlikely, organisms in the gut could integrate those genes into their own DNA and, as a result, proliferate antibiotic resistance, making it more difficult to fight off bacterial diseases.
“The lesson that I think is learned there is that science is never 100 percent certain, and that when you’re doing a risk assessment, having some humility in your technology product is important, because you never know what you’re going to discover further down the road,” she says. In the case of Recombinetics, “I don’t think there was any ill intent on the part of the researchers, but sometimes being very optimistic about your technology and enthusiastic about it causes you to have blinders on when it comes to risk assessment.”
The FDA eventually clarified its results, insisting that the study was meant only to publicize the presence of the plasmid, not to suggest the bacterial DNA was necessarily dangerous. Nonetheless, the damage was done: as a result of the blunder, a plan for Recombinetics to raise an experimental herd in Brazil was quashed.
Backlash to the FDA study exposed a fundamental disagreement between the agency and livestock biotechnologists. Scientists like Van Eenennaam, who in 2017 received a $500,000 grant from the Department of Agriculture to study Buri’s progeny, disagree with the FDA’s strict regulatory approach to gene-edited animals. Typical GMOs are transgenic, meaning they have genes from multiple different species, but modern gene-editing techniques allow scientists to stay roughly within the confines of conventional breeding, adding and removing traits that naturally occur within the species. That said, gene editing is not yet free from errors and sometimes intended changes result in unintended alterations, notes Heather Lombardi, division director of animal bioengineering and cellular therapies at the FDA’s Center for Veterinary Medicine. For that reason, the FDA remains cautious.
“There’s a lot out there that I think is still unknown in terms of unintended consequences associated with using genome-editing technology,” says Lombardi. “We’re just trying to get an understanding of what the potential impact is, if any, on safety.”
Bhanu Telugu, an animal scientist at the University of Maryland and president and chief science officer at the agriculture technology startup RenOVAte Biosciences, worries that biotech companies will migrate their experiments to countries with looser regulatory environments. Perhaps more pressingly, he says strict regulation requiring long and expensive approval processes may incentivize these companies to work only on traits that are most profitable, rather than those that may have the greatest benefit for livestock and society, such as animal well-being and the environment.
“What company would be willing to spend $20 million on potentially alleviating heat stress at this point?” he asks.
On a windy winter afternoon, Raluca Mateescu leaned against a fence post at the University of Florida’s Beef Teaching Unit while a Brahman heifer sniffed inquisitively at the air and reached out its tongue in search of unseen food. Since 2017, Mateescu, an animal geneticist at the university, has been part of a team studying heat and humidity tolerance in breeds like Brahman and Brangus (a mix between Brahman and Angus cattle). Her aim is to identify the genetic markers that contribute to a breed’s climate resilience, markers that might lead to more precise breeding and gene-editing practices.
“In the South,” Mateescu says, heat and humidity are a major problem. “That poses a stress to the animals because they’re selected for intense production—to produce milk or grow fast and produce a lot of muscle and fat.”
Like Nelore cattle in South America, Brahman are well-suited for tropical and subtropical climates, but their high tolerance for heat and humidity comes at the cost of lower meat quality than other breeds. Mateescu and her team have examined skin biopsies and found that relatively large sweat glands allow Brahman to better regulate their internal body temperature. With funding from the USDA’s National Institute of Food and Agriculture, the researchers now plan to identify specific genetic markers that correlate with tolerance to tropical conditions.
“If we’re selecting for animals that produce more without having a way to cool off, we’re going to run into trouble,” she says.
There are other avenues in biotechnology beyond gene editing that may help reduce the cattle industry’s footprint. Although still early in their development, lab-cultured meats may someday undermine today’s beef producers by offering consumers an affordable alternative to the conventionally grown product, without the animal welfare and environmental concerns that arise from eating beef harvested from a carcass.
Other biotech techniques aim to improve the beef industry without displacing it. In Switzerland, scientists at a startup called Mootral are experimenting with a garlic-based food supplement designed to alter cows’ digestive chemistry and reduce the amount of methane they emit. Studies have shown the product reduces methane emissions by about 20 percent in meat cattle, according to the New York Times.
Mootral’s owner, Thomas Hafner, believes demand will grow as governments require methane reductions from their livestock producers in order to meet the Paris climate agreement. “We are working from the assumption that down the line every cow will be regulated to be on a methane reducer,” he told the New York Times.
Meanwhile, a farm science research institute in New Zealand, AgResearch, hopes to target methane production at its source by eliminating methanogens, the microbes thought to be responsible for producing the greenhouse gas in ruminants. The AgResearch team is attempting to develop a vaccine to alter the cattle gut’s microbial composition, according to the BBC.
Genomic testing may also allow cattle producers to see what genes calves carry before they’re born, according to Mateescu, enabling producers to make smarter breeding decisions and select for the most desirable traits, whether it be heat tolerance, disease resistance, or carcass weight.
Despite all these efforts, questions remain as to whether biotech can ever dramatically reduce the industry’s emissions or afford humane treatment to captive animals in resource-intensive operations. To many of the industry’s critics, including environmental and animal rights activists, the very nature of the practice of rearing livestock for human consumption erodes the noble goal of sustainable food production. Rather than revamp the industry, these critics suggest alternatives such as meat-free diets to fulfill our need for protein. Indeed, data suggests many young consumers are already incorporating plant-based meats into their meals.
Ultimately, though, climate change may be the most pressing issue facing the cattle industry, according to Telugu of the University of Maryland, which received a grant from the Bill and Melinda Gates Foundation to improve productivity and adaptability in African cattle. “We cannot breed our way out of this,” he says.
This article was originally published on Undark. Read the original article.
Image Credit: RitaE from Pixabay
A College Kid’s Fake AI-Generated Blog Fooled Tens of Thousands. This Is How He Created It.
Karen Hao | MIT Technology Review
“At the start of the week, Liam Porr had only heard of GPT-3. By the end, the college student had used the AI model to produce an entirely fake blog under a fake name. It was meant as a fun experiment. But then one of his posts found its way to the number-one spot on Hacker News. Few people noticed that his blog was completely AI-generated. Some even hit ‘Subscribe.'”
The First Gene-Edited Squid in History Is a Biological Breakthrough
Emily Mullin | One Zero
“Scientists have long marveled at these sophisticated behaviors and have tried to understand why these tentacled creatures are so intelligent. Gene editing may be able to help researchers unravel the mysteries of the cephalopod brain. But until now, it’s been too hard to do—in part because cephalopod embryos are protected by a hard outer layer that makes manipulating them difficult.”
New Audio Deepfake AI Narrates Reddit Thread as David Attenborough
Luke Dormehl | Digital Trends
“Sir David Attenborough, wildlife documentary broadcaster and natural historian, is an international treasure who must be protected at all costs. Now 94, Attenborough is still finding new dark recesses to explore on Planet Earth—including the r/Relationships and r/AskReddit boards on Reddit. Well, kind of.”
Robotic Chameleon Tongue Snatches Nearby Objects in the Blink of an Eye
Michelle Hampson and Evan Ackerman | IEEE Spectrum
“Chameleons may be slow-moving lizards, but their tongues can accelerate at astounding speeds, snatching insects before they have any chance of fleeing. Inspired by this remarkable skill, researchers in South Korea have developed a robotic tongue that springs forth quickly to snatch up nearby items.”
Android Is Becoming a Worldwide Earthquake Detection Network
Dieter Bohn | The Verge
“It’s a feature made possible through Google’s strengths: the staggering numbers of Android phones around the world and clever use of algorithms on big data. As with its collaboration with Apple on exposure tracing and other Android features like car crash detection and emergency location services, it shows that there are untapped ways that smartphones could be used for something more important than doomscrolling.”
‘Terror Crocodile’ the Size of a Bus Fed on Dinosaurs, Study Says
Johnny Diaz | The New York Times
“‘Deinosuchus was a giant that must have terrorized dinosaurs that came to the water’s edge to drink,’ [Adam Cossette, a vertebrate paleobiologist who led the study,] said in a statement. ‘Until now, the complete animal was unknown. These new specimens we’ve examined reveal a bizarre, monstrous predator.'”
This Pacific Island Nation Plans to Raise Itself Above the Ocean to Survive Sea Level Rise
Adele Peters | Fast Company
“The previous president of Kiribati, a low-lying island nation in the Pacific, predicted that the country’s citizens would eventually become climate refugees, forced to relocate as sea level rise puts the islands underwater. But a new president elected in June now plans to elevate key areas of land above the rising seas instead.”
Digitizing Burning Man
Lucas Matney | TechCrunch
“Going virtual is an unprecedented move for an event whose mere existence already seems to defy precedent. …’I’ve fallen in love with this idea that at some point in the future, some PhD student in 300 years time is going to write a thesis on the first online Burning Man, because it does feel like an extraordinary moment of avant garde imagineering for what the future of human online interaction looks like,’ Cooke tells TechCrunch.”
Image credit: Francesco Ungaro / Unsplash