How to catch a magnetic monopole in the act

Berkeley Lab-led study could lead to smaller memory devices, microelectronics, and spintronics

A research team led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has created a nanoscale “playground” on a chip that simulates the formation of exotic magnetic particles called monopoles. The study – published recently in Science Advances – could unlock the secrets to ever-smaller, more powerful memory devices, microelectronics, and next-generation hard drives that employ the power of magnetic spin to store data.

Follow the ‘ice rules’
For years, other researchers have been trying to create a real-world model of a magnetic monopole – a theoretical subatomic particle that carries a single north or south magnetic pole. These elusive particles can be simulated and observed by manufacturing artificial spin ice materials – large arrays of nanomagnets with structures analogous to water ice – in which the arrangement of atoms isn’t perfectly symmetrical, leading to residual north or south poles.
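The "ice rules" can be illustrated with a toy model: in square artificial spin ice, four nanomagnet moments meet at each vertex, and the low-energy "two-in, two-out" configurations carry zero net magnetic charge, while violations of the rule behave like monopoles. A minimal sketch (the encoding is invented for illustration, not taken from the study):

```python
from itertools import product

def vertex_charge(spins):
    """Magnetic charge at a vertex where four nanomagnet moments meet.

    spins: tuple of four +1/-1 values, +1 meaning the moment points
    *into* the vertex. Charge = (#in - #out); the ice rule
    (two-in, two-out) corresponds to zero charge.
    """
    return sum(spins)

# Classify all 2^4 vertex configurations of square artificial spin ice.
counts = {}
for spins in product((+1, -1), repeat=4):
    q = vertex_charge(spins)
    counts[q] = counts.get(q, 0) + 1

print(counts)
# 6 ice-rule vertices (q=0), 8 monopole-like excitations (q=±2),
# 2 "double charge" vertices (q=±4)
```

The charged vertices (q ≠ 0) are the excitations that play the role of north or south monopoles in these arrays.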

>Read more on the Advanced Light Source at Berkeley Lab website

Image: This nanoscale “playground” on a chip uses nanomagnets to simulate the formation of exotic magnetic particles called “monopoles.” Full image here. Credit: Farhan/Berkeley Lab

H2020 project PaNOSC officially started

The project PaNOSC, Photon and Neutron Open Science Cloud, is one of five cluster projects funded under the European H2020 programme.

The project, which will run until December 2022, is coordinated by the ESRF and brings together six strategic European research infrastructures.

Large-scale research infrastructures produce a huge amount of scientific data on a daily basis. For their storage and future (re)use, data need to be managed according to the FAIR principles, i.e., be Findable, Accessible, Interoperable and Re-usable. The adaptation and development of both policies and technologies are key to making FAIR data a reality and to serving the broad set of stakeholders who will benefit from a coherent ecosystem of data services.

Under the headline “European Open Science Cloud (EOSC)”, projects covering a wide range of scientific disciplines from physics, astronomy, and life sciences, to social sciences and humanities, have been funded by the European Commission to build and develop the EOSC, which includes a comprehensive catalogue of services for the storage, management, analysis and re-use of research data.

>Read more on the ESRF website
>To learn more about PaNOSC (Photon and Neutron Open Science Cloud), please read here

Extremely small magnetic nanostructures with invisibility cloak

Future data storage technology

In novel concepts of magnetic data storage, small magnetic bits are to be sent back and forth in a chip structure, stored densely packed, and read out later. The magnetic stray field, however, causes problems when trying to create particularly tiny bits. Now, researchers at the Max Born Institute (MBI), the Massachusetts Institute of Technology (MIT) and DESY have been able to put an “invisibility cloak” over the magnetic structures. In this fashion, the magnetic stray field can be reduced, allowing for small yet mobile bits. The results were published in Nature Nanotechnology.

For physicists, magnetism is intimately coupled to the rotating motion of electrons in atoms. Orbiting around the atomic nucleus as well as spinning around their own axis, electrons generate the magnetic moment of the atom. The magnetic stray field associated with that magnetic moment is the property we know from, e.g., a bar magnet used to fix notes on a pinboard. It is also the magnetic stray field that is used to read the information from a magnetic hard disk drive. In today’s hard disks, a single magnetic bit measures about 15 × 45 nanometers.
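For scale, one can estimate how many such bits would tile a postage stamp; the stamp dimensions below are an assumption for illustration, not a figure from the article:

```python
# Rough estimate of how many 15 x 45 nm hard-disk bits cover a postage
# stamp. The stamp size (2 cm x 2.5 cm) is an assumed typical value.
bit_area_nm2 = 15 * 45                  # one bit: 675 nm^2
stamp_area_nm2 = 2e7 * 2.5e7            # 2 cm x 2.5 cm, in nanometres
bits_per_stamp = stamp_area_nm2 / bit_area_nm2
print(f"{bits_per_stamp:.1e} bits per stamp")  # on the order of 10^11-10^12
```

That is roughly a trillion bits in the area of a stamp, which is why any further shrinking of the bit runs into the stray-field problem described above.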

One vision for a novel concept of storing data magnetically is to send the magnetic bits back and forth in a memory chip via current pulses, in order to store them at a suitable place in the chip and retrieve them later. Here, the magnetic stray field is a bit of a curse, as it prevents the bits from being made smaller for even denser packing of the information. On the other hand, the magnetic moment underlying the stray field is required to move the structures around.

>Read more on the PETRA III at DESY website

Credit: MIT, L. Caretta/M. Huang [Source]

Ultralow fluence for the phase-change process

Ultrafast active materials with tunable properties are currently being investigated for producing memory and data-processing devices. Among others, phase-change materials (PCMs) are eligible for this purpose: they can reversibly switch between a highly conductive crystalline state (SET) and a low-conductivity amorphous state (RESET), defining a binary code. The transformation is triggered by an electrical or optical pulse of suitable intensity and duration. 3D Ge-Sb-Te (GST) based alloys of different stoichiometry are already employed in DVDs and Blu-ray disks, and they are expected to function also in non-volatile memories and RAM. The challenge is to demonstrate that scaling the GST alloys from 3D down to 2D, 1D and even 0D improves the phase-change process in terms of a lower power threshold and faster switching time. GST thin films and nanoparticles have now been synthesized and shown to function with competitive results.
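The SET/RESET encoding can be caricatured as a two-state resistor read out against a threshold. The following is a hypothetical toy model for illustration only (all names and resistance values are invented, not from the study):

```python
class PCMCell:
    """Toy phase-change memory cell: crystalline SET stores '1'
    (low resistance), amorphous RESET stores '0' (high resistance)."""
    R_SET, R_RESET = 1e4, 1e7       # illustrative resistances, in ohms

    def __init__(self):
        self.state = "SET"          # start in the crystalline state

    def write(self, bit):
        # A short, intense pulse melt-quenches the cell to amorphous
        # (RESET); a longer, gentler pulse recrystallizes it (SET).
        self.state = "SET" if bit else "RESET"

    def read(self):
        resistance = self.R_SET if self.state == "SET" else self.R_RESET
        return 1 if resistance < 1e6 else 0   # threshold read-out

cell = PCMCell()
for bit in (1, 0, 1):
    cell.write(bit)
    assert cell.read() == bit
print("stored and read back all bits")
```

The real physics, of course, lies in how little pulse energy (fluence) is needed to drive the write step, which is what the experiment below quantifies.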
A team of researchers from the University of Trieste and the MagneDyn beamline at FERMI demonstrated the optical switching from the crystalline to the amorphous state of Ge2Sb2Te5 nanoparticles (GST NPs) with sizes <10 nm, produced via magnetron sputtering by collaborators from the University of Groningen. Details were reported in the journal Nanoscale.
This work aims at showing the very low power limit of an optical pulse needed to amorphize crystalline Ge2Sb2Te5 nanoparticles. Particles of 7.8 nm and 10.4 nm diameter were deposited on mica and capped with ~200 nm of PMMA. The researchers used a table-top Ti:sapphire regenerative amplified system, available at the IDontKerr (IDK) laboratory (the MagneDyn beamline support laboratory), to produce pump laser pulses at 400 nm, of ~100 fs duration and with a repetition rate from 1 kHz down to single shot.

>Read more on the Elettra Sincrotrone Trieste website

Image (extract): Transmission electron microscopy image of the nanoparticle sample. Ultrafast single-shot optical process with fs pulses at 400 nm. Microscope images of amorphized and amorphized/ablated areas obtained on the nanoparticle sample. Comparison of amorphization threshold fluences between the thin-film and nanoparticle cases.
Please see the entire image here.

Real-time ptychographic data streaming

CAMERA/ALS/STROBE Collaboration yields novel image data workflow pipeline.

What began nearly a decade ago as a Berkeley Lab Laboratory-Directed Research and Development (LDRD) proposal is now a reality, and it is already changing the way scientists run experiments at the Advanced Light Source (ALS)—and, eventually, other light sources across the Department of Energy (DOE) complex—by enabling real-time streaming of ptychographic image data in a production environment.

In scientific experiments, ptychographic imaging combines scanning microscopy with diffraction measurements to characterize the structure and properties of matter and materials. While the method has been around for some 50 years, broad utilization has been hampered by the fact that the experimental process was slow and the computational processing of the data to produce a reconstructed image was expensive. But in recent years advances in detectors and x-ray microscopes at light sources such as the ALS have made it possible to measure a ptychographic dataset in seconds.
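The forward model behind ptychography can be sketched in a few lines: a localized probe is scanned across an object at overlapping positions, and each measurement records only the far-field diffraction intensity (the phase is lost, which is what the reconstruction recovers). A toy illustration in Python/NumPy, with all sizes and values invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex-valued object (transmission function) and a localized probe.
N, n = 64, 16                       # object and probe sizes, in pixels
obj = np.exp(1j * rng.uniform(0, 0.5, (N, N)))
probe = np.zeros((n, n), complex)
probe[4:12, 4:12] = 1.0             # simple square illumination

# Overlapping scan positions: the defining feature of ptychography.
step = 8                            # 50% overlap for a 16-pixel probe
positions = [(y, x) for y in range(0, N - n + 1, step)
                    for x in range(0, N - n + 1, step)]

# Each measurement is a far-field diffraction intensity; phase is lost.
patterns = []
for (y, x) in positions:
    exit_wave = probe * obj[y:y + n, x:x + n]
    patterns.append(np.abs(np.fft.fft2(exit_wave))**2)

print(len(positions), "diffraction patterns of shape", patterns[0].shape)
```

The redundancy from the overlapping positions is what lets iterative algorithms recover the lost phase and reconstruct the object, and it is also why the data volumes are large enough to demand the streaming pipeline described here.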

>Read more on the Berkeley Lab website

Picture: The modular, scalable Nanosurveyor II system—now up and running at the ALS—employs a two-sided infrastructure that integrates the ptychographic image data acquisition, preprocessing, transmission and visualization processes.

Scientists use machine learning to speed discovery of metallic glass

In a new report, they combine artificial intelligence and accelerated experiments to discover potential alternatives to steel in a fraction of the time.

Blend two or three metals together and you get an alloy that usually looks and acts like a metal, with its atoms arranged in rigid geometric patterns.

But once in a while, under just the right conditions, you get something entirely new: a futuristic alloy called metallic glass that’s amorphous, with its atoms arranged every which way, much like the atoms of the glass in a window. Its glassy nature makes it stronger and lighter than today’s best steel, plus it stands up better to corrosion and wear.

Even though metallic glass shows a lot of promise as a protective coating and alternative to steel, only a few thousand of the millions of possible combinations of ingredients have been evaluated over the past 50 years, and only a handful developed to the point that they may become useful.

Now a group led by scientists at the Department of Energy’s SLAC National Accelerator Laboratory, the National Institute of Standards and Technology (NIST) and Northwestern University has reported a shortcut for discovering and improving metallic glass – and, by extension, other elusive materials – at a fraction of the time and cost.

>Read more on the SLAC website

Image: Fang Ren, who developed algorithms to analyze data on the fly while a postdoctoral scholar at SLAC, at a Stanford Synchrotron Radiation Lightsource beamline where the system has been put to use.
Credit: Dawn Harmer/SLAC National Accelerator Laboratory

10 out of 10

Diamond will have processed 10 petabytes of data over its 10 years of research and innovation.

Today, on the 10th day of the 10th month of the year, Diamond will have processed 10 petabytes of data over its 10 years of research and innovation. To put this into perspective, 10 petabytes (1 × 10^16 bytes) is equivalent to over 2 million DVDs, 200 million four-drawer filing cabinets filled with text, the entire memory of four human brains or 20,000 years of MP3 songs playing continuously.
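These equivalences can be sanity-checked with back-of-the-envelope arithmetic; the DVD capacity and MP3 bitrate below are assumed typical values, not figures from Diamond:

```python
# Back-of-the-envelope check of the 10-petabyte comparisons.
total_bytes = 10 * 10**15            # 10 petabytes = 1e16 bytes

dvd_bytes = 4.7e9                    # single-layer DVD, ~4.7 GB
dvds = total_bytes / dvd_bytes       # roughly 2 million DVDs

mp3_bytes_per_s = 128_000 / 8        # a 128 kbit/s MP3 stream
years = total_bytes / mp3_bytes_per_s / (3600 * 24 * 365.25)

print(f"{dvds / 1e6:.1f} million DVDs, {years:,.0f} years of MP3")
```

Both come out close to the quoted figures: just over 2 million DVDs and on the order of 20,000 years of continuous MP3 playback.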

Collected during experiments on over 30 beamlines and integrated facilities, in the past 10 years Diamond data has fostered breakthroughs in fields ranging from health, the environment and engineering to astrophysics and archaeology. And new beamlines, improved capabilities and growing numbers of users mean that Diamond is processing more data than ever before. In fact, Diamond’s Data Acquisition team processed almost 2 petabytes of data – a fifth of all data processed at Diamond during its working lifetime – in 2016 alone.