Study elucidates evolution of mosquitoes and their hosts

 

Researchers at North Carolina State University and global collaborators have mapped the mosquito’s tree of life, a major step toward understanding important traits such as how the insects choose their hosts, feed on blood and spread disease. The findings will help researchers build better models of disease transmission and understand what makes some mosquitoes more effective disease carriers than others.

 

The research suggests that mosquito evolution over the past 200 million years mirrors the Earth’s history of shifting land masses and changing host organisms, says Dr. Brian Wiegmann, William Neal Reynolds Professor of Entomology at NC State and corresponding author of a paper describing the mosquito family tree, published in Nature Communications.

 

Read the full article at: phys.org

Microsoft is hiring a nuclear energy expert to help power data centers with small nuclear reactors

 

Artificial intelligence takes a lot of compute power, and Microsoft is putting together a road map for powering that computation with small nuclear reactors. That’s according to a job description Microsoft posted Thursday seeking a nuclear technology expert to lead the company’s technical assessment for integrating small modular nuclear reactors and microreactors “to power the datacenters that the Microsoft Cloud and AI reside on.”

 

Specifically, Microsoft is looking to hire a “principal program manager for nuclear technology” who “will be responsible for maturing and implementing a global Small Modular Reactor (SMR) and microreactor energy strategy,” the job posting reads. Microsoft is looking to generate energy with nuclear fission, in which an atom’s nucleus splits and releases energy.

 

News of this job description was first reported on DCD, a website about data centers. In January 2023, Microsoft announced a multiyear, multibillion-dollar investment in OpenAI, maker of viral AI chatbot ChatGPT. Bill Gates, Microsoft’s co-founder, is also the chairman of the board of TerraPower, a nuclear innovation company in the process of developing and scaling small modular reactor designs. TerraPower “does not currently have any agreements to sell reactors to Microsoft,” a spokesperson told CNBC. However, Microsoft has publicly committed to pursuing nuclear energy from an innovator in the fusion space.

 

In May, Microsoft announced it had signed a power purchase agreement with Helion, a nuclear fusion startup, to buy electricity from it in 2028. Sam Altman, CEO of OpenAI, is an early and significant investor in Helion. Nuclear fusion occurs when two smaller atomic nuclei smash together to form a heavier nucleus, releasing tremendous quantities of energy in the process; it is how the sun generates its power. Fusion has not yet been achieved at scale on Earth, but many venture-backed startups are working to make it a reality, drawn by the promise of virtually unlimited clean energy.
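For a sense of the energies involved, here is the most widely studied fusion reaction, deuterium-tritium fusion, shown only as a textbook reference point; it is not necessarily the fuel cycle Helion is pursuing:

```latex
% Deuterium-tritium fusion: two light nuclei merge into helium-4 and a neutron,
% releasing about 17.6 MeV per reaction.
{}^{2}_{1}\mathrm{H} + {}^{3}_{1}\mathrm{H} \;\longrightarrow\; {}^{4}_{2}\mathrm{He}\ (3.5\ \mathrm{MeV}) + n\ (14.1\ \mathrm{MeV})
```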

 

Interest in nuclear energy has increased alongside concerns about climate change in recent years, because nuclear reactors generate electricity while releasing virtually no carbon dioxide.

The existing fleet of nuclear reactors in the U.S. was largely built between 1970 and 1990 and currently generates about 18% of the country’s total electricity, according to the U.S. Energy Information Administration. Nuclear energy also made up 47% of America’s carbon-free electricity in 2022, according to the U.S. Department of Energy.

 

Much of the hope for the next generation of nuclear reactor technology in the U.S. is pinned on smaller nuclear reactors, which Microsoft’s job posting indicates the company is interested in using to power its data centers.

Read the full article at: www.cnbc.com

Adobe Firefly’s generative AI models can now create vector graphics in Adobe Illustrator

 

Adobe Illustrator is a widely used vector graphics tool for graphic artists, and it’s about to join the generative AI era with the launch of the Firefly Vector Model at Adobe’s MAX conference today. Adobe describes the new model as “the world’s first generative AI model focused on producing vector graphics.” Like Firefly for creating images and photos, Firefly for Illustrator will be able to create entire vector graphics from scratch. And like the other Firefly models, the vector model, too, was trained on data from Adobe Stock.

 

In its beta, Illustrator will now let you create entire scenes from a text prompt. What’s nifty here is that those scenes can consist of multiple objects. So this isn’t just a jumble of vectors making up the overall graphic: Illustrator automatically generates the separate objects, and you can manipulate them individually to your heart’s content, just like any other group or layer in Illustrator.

 

Alexandru Costin, Adobe’s VP for generative AI and Sensei, told me the company used tens of millions of vector images in Adobe Stock to train Firefly for this new capability. Costin described the process as “a journey”: far less work has been done on using generative AI to create vector drawings than on generating other kinds of images, so this surely took more effort on the team’s part. He noted that the team also focused on creating a model that generates these images with the fewest possible points.

 

Another new feature coming to Illustrator is called Mockup, which lets users apply any vector art to a 3D scene, such as a design for a drink can or a new logo on a t-shirt. “Mockup is really exciting to show your customers the art in context so they understand what they’re buying when they contract you as a freelancer,” Costin explained. Also new is Retype, which converts static text in images to editable text and finds matching fonts. And Illustrator is now available on the web, too.

Read the full article at: techcrunch.com

After 15 years, pulsar timing yields evidence of cosmic background gravitational waves

 

The universe is humming with gravitational radiation—a very low-frequency rumble that rhythmically stretches and compresses spacetime and the matter embedded in it. That is the conclusion of several groups of researchers from around the world who simultaneously published a slew of journal articles in June describing more than 15 years of observations of millisecond pulsars within our corner of the Milky Way galaxy. At least one group—the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) collaboration—has found compelling evidence that the precise rhythms of these pulsars are affected by the stretching and squeezing of spacetime by these long-wavelength gravitational waves.
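For context on the “nanohertz” in NANOGrav’s name: waves with periods of years to decades oscillate at a few billionths of a hertz, which is why detecting them takes 15 or more years of pulsar timing. A quick unit conversion (a back-of-the-envelope sketch, not a calculation from the papers):

```python
# Convert gravitational-wave periods measured in years to frequencies in nanohertz.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds

for period_years in (1, 10, 15):
    frequency_nhz = 1e9 / (period_years * SECONDS_PER_YEAR)
    print(f"period {period_years:>2} yr  ->  {frequency_nhz:.1f} nHz")
```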

 

“This is key evidence for gravitational waves at very low frequencies,” says Vanderbilt University’s Stephen Taylor, who co-led the search and is the current chair of the collaboration. “After years of work, NANOGrav is opening an entirely new window on the gravitational-wave universe.”

 

Read the full article at: phys.org

Severe space weather can mess up bird migrations, a new study indicates

 

New research indicates that severe space weather events, such as solar flares, disrupt birds’ navigational skills during long migrations.

 

Previous research has indicated that when flying at night, birds (and many other animals) use Earth’s magnetic field for navigation. Because solar events disrupt the magnetic field — as well as produce auroras — birds have more difficulty navigating during them. The new study analyzed images taken from 37 NEXRAD Doppler weather radar stations, which can detect groups of migrating birds, as well as data from ground-based magnetometers, to study 23 years of bird migration across the U.S. Great Plains. The 1,000-mile (1,600-kilometer) span from North Dakota to Texas is considered a major migratory corridor for birds.

 

“The biggest challenge was trying to distill such a large dataset — years and years of ground magnetic field observations — into a geomagnetic disturbance index for each radar site,” Daniel Welling, a University of Michigan space scientist, said in a statement. “There was a lot of heavy lifting in terms of assessing data quality and validating our final data product to ensure that it was appropriate for this study.”
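The study’s actual index construction is not described in this excerpt, but a minimal sketch of the kind of processing Welling describes, reducing raw magnetometer time series to a per-site disturbance index, might look like the following; the function name and the hourly-range metric are illustrative assumptions, not the paper’s method:

```python
import numpy as np

def hourly_disturbance_index(b_horizontal_nt, samples_per_hour=60):
    """Illustrative geomagnetic disturbance proxy (NOT the study's method):
    the range (max - min) of the horizontal magnetic field within each hour,
    in nanotesla. Larger ranges indicate a more disturbed field."""
    n_hours = len(b_horizontal_nt) // samples_per_hour
    trimmed = np.asarray(b_horizontal_nt[: n_hours * samples_per_hour], dtype=float)
    hours = trimmed.reshape(n_hours, samples_per_hour)
    return hours.max(axis=1) - hours.min(axis=1)

# Hypothetical usage: one day of 1-minute magnetometer samples near a radar site.
rng = np.random.default_rng(0)
field = 20_000 + rng.normal(0, 5, 24 * 60)          # quiet-day field, nT
field[12 * 60 : 15 * 60] += 300 * rng.random(180)   # inject a disturbed interval
index = hourly_disturbance_index(field)
print(np.round(index, 1))  # hours 12-14 stand out as disturbed
```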

 

The work paid off. The researchers discovered that the number of migrating birds in this region decreases by 9 to 17 percent during severe space weather events. They also noticed increased rates of birds becoming lost during migration, a phenomenon known as migratory bird vagrancy.

Read the full article at: www.space.com

Scientists Use CRISPR to Make Chickens More Resistant to Bird Flu

 

A new study highlights both the promise and the limitations of gene editing, as a highly lethal form of avian influenza continues to spread around the world.

 

Scientists have used the gene-editing technology known as CRISPR to create chickens that have some resistance to avian influenza, according to a new study that was published in the journal Nature Communications on Tuesday. The study suggests that genetic engineering could potentially be one tool for reducing the toll of bird flu, a group of viruses that pose grave dangers to both animals and humans. But the study also highlights the limitations and potential risks of the approach, scientists said. Some breakthrough infections still occurred, especially when gene-edited chickens were exposed to very high doses of the virus, the researchers found. And when the scientists edited just one chicken gene, the virus quickly adapted.

 

The findings suggest that creating flu-resistant chickens will require editing multiple genes and that scientists will need to proceed carefully to avoid driving further evolution of the virus, the study’s authors said. The research is “proof of concept that we can move toward making chickens resistant to the virus,” Wendy Barclay, a virologist at Imperial College London and an author of the study, said at a news briefing. “But we’re not there yet.” Some scientists who were not involved in the research had a different takeaway. “It’s an excellent study,” said Dr. Carol Cardona, an expert on bird flu and avian health at the University of Minnesota. But to Dr. Cardona, the results illustrate how difficult it will be to engineer a chicken that can stay a step ahead of the flu, a virus known for its ability to evolve swiftly. “There’s no such thing as an easy button for influenza,” Dr. Cardona said. “It replicates quickly, and it adapts quickly.”

What to Know About Avian Flu

The spread of H5N1. A new variant of this strain of the avian flu has spread widely through bird populations in recent years. It has taken an unusually heavy toll on wild birds and repeatedly spilled over into mammals, including minks, foxes and bears.

 

Research cited, published in Nature Communications (Oct. 10, 2023):

https://doi.org/10.1038/s41467-023-41476-3 

Read the full article at: www.nytimes.com

The Future of AI is Here: Cerebras’ WSE-2 is the largest computer chip ever built and the fastest AI processor on Earth

 

Cluster-Scale Performance on a Single Large-Wafer Chip

Programming a cluster to scale deep learning is painful. It typically requires dozens to hundreds of engineering hours and remains a practical barrier for many to realize the value of large-scale AI for their work. On a traditional GPU cluster, ML researchers – typically using a special version of their ML framework – must figure out how to distribute their model while still achieving some fraction of their convergence and performance target. They must navigate the complex hierarchy of individual processors’ memory capacity, bandwidth, interconnect topology, and synchronization, all while performing a myriad of hyperparameter and tuning experiments along the way. What’s worse, the resulting implementation is brittle to change, and all of this engineering only adds to the overall time to solution. With the WSE, there is no such bottleneck. We give you a cluster-scale AI compute resource with the programming ease of a single desktop machine using stock TensorFlow or PyTorch. Spend your time on AI discovery, not cluster engineering.
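To make that contrast concrete, here is what a generic single-device training loop looks like in stock PyTorch; on a conventional GPU cluster, scaling this same model would additionally require process-group setup, a DistributedSampler for the data, and a DistributedDataParallel wrapper. This is a hypothetical sketch, not Cerebras’s own integration code.

```python
import torch
import torch.nn as nn

# A generic single-device training loop in stock PyTorch.
# On a multi-GPU cluster, scaling this same model would also require
# torch.distributed.init_process_group(...), a DistributedSampler for the data,
# and wrapping the model in torch.nn.parallel.DistributedDataParallel.
model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 10))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    x = torch.randn(64, 512)          # stand-in batch of inputs
    y = torch.randint(0, 10, (64,))   # stand-in labels
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```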

Designed for AI
 

Each core on the WSE is independently programmable and optimized for the tensor-based, sparse linear algebra operations that underpin neural network training and inference, enabling it to deliver maximum performance, efficiency, and flexibility. The WSE-2 packs 850,000 of these cores onto a single processor. With that, any data scientist can run state-of-the-art AI models and explore innovative algorithmic techniques at record speed and scale, without ever touching distributed scaling complexities.
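As a small illustration of the “sparse linear algebra operations” mentioned above, here is a sparse matrix product in plain PyTorch; the point is only that most entries are zero and never need to be stored or multiplied. The example is generic, not Cerebras-specific.

```python
import torch

# A mostly-zero weight matrix stored in sparse COO format: only the
# nonzero entries (and their coordinates) are kept and multiplied.
indices = torch.tensor([[0, 1, 3], [2, 0, 1]])   # rows and columns of nonzeros
values = torch.tensor([0.5, -1.2, 2.0])
sparse_w = torch.sparse_coo_tensor(indices, values, size=(4, 3))

x = torch.randn(3, 1)
y = torch.sparse.mm(sparse_w, x)   # sparse matrix times dense vector
print(y.shape)                     # torch.Size([4, 1])
```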

 

1000x Memory Capacity and Bandwidth

 

Unlike traditional devices, in which the working cache memory is tiny, the WSE-2 takes 40 GB of super-fast on-chip SRAM and spreads it evenly across the entire surface of the chip. This gives every core single-clock-cycle access to fast memory at extremely high bandwidth: 20 PB/s. That is 1,000x more capacity and 9,800x greater bandwidth than the leading GPU, so no trade-off is required. You can run large, state-of-the-art models and real-world datasets entirely on a single chip, minimizing wall-clock training time and achieving real-time inference within latency budgets, even for large models and datasets.
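Taking the article’s multipliers at face value, one can back out roughly what the “leading GPU” figures would have to be; this is a quick arithmetic check, not data quoted by Cerebras, but the results land in the neighborhood of a modern data-center GPU’s on-chip cache and memory bandwidth:

```python
# Back out the implied "leading GPU" figures from the multipliers quoted above.
wse2_sram_gb = 40      # on-chip SRAM capacity
wse2_bw_pb_s = 20      # on-chip memory bandwidth, petabytes per second

implied_gpu_cache_mb = wse2_sram_gb * 1000 / 1000    # 1,000x capacity claim -> 40 MB
implied_gpu_bw_tb_s = wse2_bw_pb_s * 1000 / 9800     # 9,800x bandwidth claim -> ~2 TB/s
print(f"implied GPU: ~{implied_gpu_cache_mb:.0f} MB cache, ~{implied_gpu_bw_tb_s:.1f} TB/s bandwidth")
```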

High Bandwidth – Low Latency
 

Deep learning requires massive communication bandwidth between the layers of a neural network. The WSE uses an innovative high bandwidth, low latency communication fabric that connects processing elements on the wafer at tremendous speed and power efficiency. Dataflow traffic patterns between cores and across the wafer are fully configurable in software. The WSE-2 on-wafer interconnect eliminates the communication slowdown and inefficiencies of connecting hundreds of small devices via wires and cables. It delivers an incredible 220 Pb/s processor-processor interconnect bandwidth. That’s more than 45,000x the bandwidth delivered between graphics processors.
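The same back-of-the-envelope check works for the interconnect claim: dividing the quoted fabric bandwidth by the 45,000x factor implies a per-GPU link figure of roughly 5 Tb/s (about 600 GB/s), which is the right order of magnitude for modern GPU-to-GPU links. This is an arithmetic sanity check, not a figure from Cerebras.

```latex
\frac{220\ \mathrm{Pb/s}}{45{,}000} \approx 4.9\ \mathrm{Tb/s} \approx 610\ \mathrm{GB/s}
```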

Read the full article at: www.cerebras.net

The Ethics of AI in Content Creation: Balancing Automation with Originality

With recent developments in generative AI, the ethics of content creation and the use of human-made content have come under scrutiny. And while the generative AI industry is still in its infancy, many companies must take measures to balance automation with original, high-quality content.

It is still too early to tell which direction the generative AI industry will take and what limitations will be placed on generative AI platforms. For now, then, companies using this technology are the ones responsible for ensuring ethical use and protecting the rights of content creators. Here is how some companies can find a balance between automation and originality.

 

Read the full article at: blog.scoop.it

Pioneers of mRNA COVID Vaccines Win Nobel Prize for Medicine

 

Katalin Karikó and Drew Weissman laid the groundwork for immunizations that were rolled out during the pandemic at record-breaking speed. This year’s Nobel Prize in Physiology or Medicine has been awarded to biochemist Katalin Karikó and immunologist Drew Weissman for discoveries that enabled the development of mRNA vaccines against COVID-19. The vaccines have been administered more than 13 billion times, saved millions of lives and prevented millions of cases of severe COVID-19, said the Nobel committee. Karikó, who is at Szeged University in Hungary, and Weissman, at the University of Pennsylvania in Philadelphia (UPenn), paved the way for the vaccines’ development by finding a way to deliver the genetic material called messenger RNA into cells without triggering an unwanted immune response. They will each receive an equal share of the prize, which totals 11 million Swedish kronor (about US$1 million). Karikó is the 13th woman to win the Nobel Prize in Physiology or Medicine. She was born in Hungary and moved to the United States in the 1980s. “Hopefully, this prize will inspire women and immigrants and all of the young ones to persevere and be resilient. That’s what I hope,” she tells Nature.

 

https://doi.org/10.1038/d41586-023-03046-x

Read the full article at: www.nature.com

The Newest and Largest Starlink Satellites Are Also the Faintest

 

Despite being larger than the original Starlink satellites, the new “Mini” version is fainter, meeting astronomers’ recommendations.

 

SpaceX launched its first batch of second-generation Starlink satellites on February 27th. These spacecraft are called “Mini,” but they are small only in comparison to the full-size satellites that will come later: with 116 square meters of surface area, they are more than four times the size of the first-generation spacecraft.

The Minis’ large dimensions were an immediate concern for professional and amateur astronomers alike, because area usually translates into brightness. However, SpaceX changed the satellites’ physical design and concept of operations (conops) in order to mitigate their brightness. The company developed a highly reflective dielectric mirror film and a low-reflectivity black paint, which are applied to several parts of the spacecraft body. The mirror-like surface reflects sunlight into space instead of scattering it toward observers on the ground. In addition, the solar panels can be oriented so that observers do not see their sunlit sides.

 

The brightness mitigation plan sounded promising, but measurements were needed to determine its effectiveness. So a group of satellite observers began recording magnitudes. Scott Harrington recorded the first data point visually on March 14th and has since obtained 125 additional magnitudes from his dark-sky location in Arkansas. Meanwhile, Andreas Hornig developed software to process video observations; he derived 108 magnitude measurements recorded from Macedonia on the night of April 12th alone. In all, we have acquired 506 brightness measurements for our study.

 

SpaceX launched three additional batches of 21 or more Mini satellites in April, May, and June. These spacecraft ascend from low orbit-insertion altitudes toward their eventual altitude of 560 km (350 mi). Until May, we were observing Mini satellites at all heights without knowing whether they were operating in brightness-mitigation mode. Then Richard Cole in the UK noticed that some spacecraft had leveled off at 480 km. He reasoned that these satellites might already be in mitigation mode and suggested that we prioritize them.

 

We found that the Minis at that height were several magnitudes fainter than those at other altitudes. SpaceX sent us a message on May 16th confirming that Richard was correct. Now that we could distinguish between mitigated and unmitigated spacecraft, we began to characterize the brightness of each group, prioritizing measurements for those satellites that were already operational.

Observed brightness indicates how severely satellites impact celestial observations. The average magnitude for mitigated Mini spacecraft in our database is 7.1, just fainter than the brightness limit recommended by astronomers’ guidelines. So most of them are invisible to the unaided eye and do not interfere greatly with research.
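For readers new to the magnitude scale: it is logarithmic and runs backwards, so magnitude 7.1 is fainter than the conventional naked-eye limit of about 6. The standard conversion from a magnitude difference to a brightness ratio (a textbook relation, not a figure from the article) gives:

```latex
\frac{F_{6.0}}{F_{7.1}} = 10^{\,0.4\,(7.1 - 6.0)} \approx 2.8
```

In other words, a mitigated Mini at magnitude 7.1 appears roughly 2.8 times fainter than the faintest star a typical unaided eye can detect.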

Read the full article at: skyandtelescope.org