Ode to E Pluribus Unum for Sunday, November 21, 2021
Video of a Green Flash
Video Credit & Copyright: Paolo Lazzarotti
Many think it is just a myth. Others think it is true but its cause isn't known. Adventurers pride themselves on having seen it.
It's a green flash from the Sun.
The truth is the green flash does exist and its cause is well understood. Just as the setting Sun disappears completely from view, a last glimmer appears startlingly green.
The effect is typically visible only from locations with a low, distant horizon, and lasts just a few seconds. A green flash is also visible for a rising Sun, but takes better timing to spot.
A dramatic green flash was caught on video last month as the Sun set beyond the Ligurian Sea from Tuscany, Italy. The second sequence in the featured video shows the green flash in real time, while the first is sped up and the last is in slow motion. The Sun itself does not turn partly green -- the effect is caused by layers of the Earth's atmosphere acting like a prism.
One Month in the Boiling Cauldron Life of Our Sun
There’s a lot to see besides those spectacular blasts.
Music of Angels and Archangels:
Music to Heal All Pains of the Body, Soul and Spirit, Calm the Mind…the Blending of New Age Music with Timeless Images. Three and a half hours of soul refreshment.
Save the URL and summon the calm beauty whenever you need a break.
Hubble Telescope Team Gets One Instrument Running Again
By Mike Wall
The Hubble Space Telescope appears to be bouncing back from its latest glitch.
Late last month, the famous scope suffered a problem with the synchronization of its internal communications, sending all five of its science instruments into a protective "safe mode." The Hubble team has been troubleshooting the issue ever since, and it just notched a big success.
"The Hubble team successfully recovered the Advanced Camera for Surveys instrument Nov. 7," NASA officials said in a statement today (Nov. 8). "The instrument has started taking science observations once again."
The team targeted the ACS first because "it faces the fewest complications should a lost message occur," NASA officials added. The other four Hubble instruments remain in safe mode, though technicians are working to bring them back as well.
"Over the past week, the mission team has continued investigating the root cause of the synchronization issues and has seen no additional problems," NASA officials said in today's statement. "The team will continue looking into possible short-term solutions this week and develop estimates for implementation. Once this occurs, the team will discuss returning the other instruments to operational status and resuming their science observations."
Hubble has overcome a number of challenges since its April 1990 launch. Famously, the observatory headed to orbit with a flawed primary mirror, which spacewalking astronauts fixed in December 1993.
That crewed visit was the first of five Hubble servicing missions, during which astronauts repaired, maintained and upgraded the venerable scope. During the last one, which occurred in May 2009, astronauts installed two new instruments on Hubble, the Cosmic Origins Spectrograph and the Wide Field Camera 3.
Those two instruments, along with the ACS, the Space Telescope Imaging Spectrograph and the Near Infrared Camera and Multi-Object Spectrometer, make up Hubble's main science payload today (though the scope's fine guidance sensors can, and sometimes do, make scientific observations). All remain offline at the moment except the ACS.
Hubble has been showing signs of its advanced age recently. This summer, for example, a glitch put the entire observatory in safe mode for more than a month.
Remember the Guy with a Beach Chair and Balloons Over LA?
Let’s see if Judson Graham’s strapping on 50 drone motors and a para-wing makes more sense.
Watch how he builds the system, then takes to the air without buying the farm. Looks like a lot of fun to me.
The Chip that Changed the World
Most of the wealth created since 1971 is a result of Intel’s 4004 microprocessor.
By Andy Kessler
The Intel 4004 microprocessor, 1971.
Photo: Getty Images
The world changed on Nov. 15, 1971, and hardly anyone noticed. It is the 50th anniversary of the launch of the Intel 4004 microprocessor, a computer carved onto silicon, an element as plentiful on earth as sand on a beach. Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since.
Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them. Feeding decks of punch cards into a reader and typing simple commands into clunky Teletype machines were the only ways to interact with the IBM computers. Digital Equipment Corp. sold PDP-8 minicomputers to labs and offices that weighed 250 pounds.
In 1969, Nippon Calculating Machine Corp. asked Intel to design 12 custom chips for a new printing calculator. Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and suggested instead four chips, including one programmable chip they could use for many products. Using only 2,300 transistors, they created the 4004 microprocessor. Four bits of data could move around the chip at a time. The half-inch-long rectangular integrated circuit had a clock speed of 750 kilohertz and could do about 92,000 operations a second.
Intel introduced the 3,500-transistor, eight-bit 8008 in 1972; the 29,000-transistor, 16-bit 8086, capable of 710,000 operations a second, was introduced in 1978. IBM used the next iteration, the Intel 8088, for its first personal computer. By comparison, Apple’s new M1 Max processor has 57 billion transistors doing 10.4 trillion floating-point operations a second. That is at least a billionfold increase in computer power in 50 years. We’ve come a long way, baby.
When I met Mr. Hoff in the 1980s, he told me that he once took his broken television to a repairman, who noted a problem with the microprocessor. The repairman then asked why he was laughing.
Now that everyone has a computer in his pocket, one of my favorite movie scenes isn’t quite so funny. In “Take the Money and Run” (1969), Woody Allen’s character interviews for a job at an insurance company and his interviewer asks, “Have you ever had any experience running a high-speed digital electronic computer?” “Yes, I have.” “Where?” “My aunt has one.”
Silicon processors are little engines of human ingenuity that run clever code scaled to billions of devices. They get smaller, faster, cheaper and use less power every year as they spread like Johnny Appleseed's seeds through society. These days, everyone's aunt has at least one. Today's automobiles often need 50 or more microprocessors to drive down the road, although with the current chip shortage, many are sitting on lots waiting for chips.
Mobile computing paved the way for smartphones, robotic vacuum cleaners, autonomous vehicles, moisture sensors for crops—even GPS tracking for the migratory patterns of birds. It also has created an infinitely updatable world—bugs get fixed and new features are rolled out without a change in hardware.
The separation of hardware and software, of design and control, is underappreciated. It enables global supply chains, for good or bad (mostly good). Apple can design in California, manufacture anywhere, and add its software at any point in the manufacturing process.
I’m convinced that most wealth created since 1971 is a direct result of the 4004. All of tech. All of finance. All of retail—ask Walmart about its inventory system. Oil? Good luck finding and drilling it without smart machines.
So 50 years later, have we reached the limit? For the past 20 years, microprocessors have boosted performance by adding more computing cores per chip. That Apple M1 has 16 processor cores. Graphics processing units, often used for artificial intelligence and bitcoin mining, can have thousands of processor cores.
Someday Gordon Moore's Law from 1965—the number of transistors per chip doubles about every two years—will poop out. Someday John von Neumann's architecture of processor and memory, first described in 1945, will no longer meet our computing needs. My guess is that we have another decade or two to squeeze more gains out of our current chip technology and computer architecture.
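As a back-of-the-envelope check (my arithmetic, not Kessler's), compounding the 4004's 2,300 transistors at one doubling every two years lands remarkably close to today's chips:

```python
# Moore's Law sketch: double the 4004's 2,300 transistors
# once every two years, from 1971 to 2021.
transistors_1971 = 2_300
doublings = (2021 - 1971) / 2          # 25 doublings in 50 years
projected = transistors_1971 * 2 ** doublings
print(f"{projected:,.0f}")             # 77,175,193,600
```

That projects roughly 77 billion transistors, the same order of magnitude as the 57 billion in Apple's M1 Max, so the two-year doubling has held up surprisingly well over the full 50 years.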
Luckily for us, computing doesn't stand still. The neural networks used by Amazon's Alexa to recognize your voice and Google to pick out faces in photos won't replace the microprocessor, but they will likely serve as a complementary technology that can scale up for the next 50 years. Google is about to introduce a next-generation Tensor chip, updating the ones found in the company's Pixel 6 phones. It is basically an artificial-intelligence accelerator in your pocket, allowing your phone to adapt to you—your own personal neural network to train. It is exciting and scary at the same time. What will we do with all this power? That question is like seeing the 4004 and being asked what, besides calculators, microprocessors would be used for in the future. Your answer would probably be off by a factor of a billion.
Hitech Journal Nov. 15, 2021
November 12th Canapa Shop Walk
814-horsepower 2019 McLaren Senna Can-Am car…and it's street legal? Yes, though I doubt it will ever see a public thoroughfare.
The Misty Miss Christy
June Christy (1925-1990) was best known for her work in the cool jazz genre and for her silky smooth vocals. Her success as a singer began with the Stan Kenton Orchestra. She pursued a solo career from 1954 and is best remembered for her debut album Something Cool. After her death, she was hailed as "one of the finest and most neglected singers of her time."
In 1945, after hearing that Anita O'Day had left Stan Kenton's Orchestra, she auditioned and was chosen as the band's new vocalist. Her voice produced hits such as "Shoo Fly Pie and Apple Pan Dowdy", the million-selling "Tampico" in 1945, and "How High the Moon". "Tampico" was Kenton's biggest-selling record.
From 1947, she started to work on her own records, primarily with arranger and bandleader Pete Rugolo. [Rugolo had been Kenton's premier arranger and sometime composer, whose 'Theme and Variations' stands in my mind as the greatest big band jazz piece ever.] In 1954, she released a 10" LP entitled Something Cool, recorded with Rugolo and his orchestra, a gathering of notable Los Angeles jazz musicians that included her husband, multi-instrumentalist Bob Cooper, and alto saxophonist Bud Shank. Something Cool launched the vocal cool movement of the 1950s, and it hit the Top 20 charts, as did her third album, The Misty Miss Christy.
Christy returned to the recording studio in 1977 to record her final solo LP, Impromptu. She recorded an interview in 1987 for a Paul Cacia produced album called "The Alumni Tribute to Stan Kenton" on the Happy Hour label. A number of other Kenton alumni (Shorty Rogers, Lee Konitz, Jack Sheldon, among them) interspersed their tunes with reminiscences of the man and the years on the road.
I’m Going to Love that Guy https://youtu.be/uIyt4KyaTQI
Something Cool https://youtu.be/jn8EtaxGJP0?t=1
Taking a Chance on Love https://youtu.be/rpKDock6REk?t=3
The Best of June Christy https://youtu.be/5XIuQ7rQ-1U
If you get the impression I like June Christy, you’ve got that one right. June with Stan Kenton was a game-changer in jazz vocals.
Light Pillar over Volcanic Etna
What's happening above that volcano? Something very unusual -- a volcanic light pillar.
More typically, light pillars are caused by sunlight and so appear as a bright column that extends upward above a rising or setting Sun. Alternatively, other light pillars -- some quite colorful -- have been recorded above street and house lights.
This light pillar, though, was illuminated by the red light emitted by the glowing magma of an erupting volcano. The volcano is Italy's Mount Etna, and the featured image was captured with a single shot a few hours after sunset in mid-June.
Freezing temperatures above the volcano's ash cloud created ice crystals, either in cirrus clouds high above the volcano or in condensed water vapor expelled by Mount Etna. These ice crystals -- mostly flat toward the ground but fluttering -- then reflected light from the volcano's caldera.
Lithium-Ion Batteries Made with Recycled Materials Can Outlast Newer Counterparts
Proving performance could boost battery manufacturers’ confidence in reused materials
Without recycling, manufacturers may run out of resources needed to make lithium-ion batteries for electric vehicles, smartphones and other devices in the decades to come.
Drew Angerer/Getty Images
By Carolyn Wilke
Lithium-ion batteries with recycled cathodes can outperform batteries with cathodes made from pristine materials, lasting for thousands of additional charging cycles, a study finds.
Growing demand for these batteries — which power devices from smartphones to electric vehicles — may outstrip the world’s supply of some crucial ingredients, such as cobalt (SN: 5/7/19). Ramping up recycling could help avert a potential shortage. But some manufacturers worry that impurities in recycled materials may cause battery performance to falter.
“Based on our study, recycled materials can perform as well as, or even better than, virgin materials,” says materials scientist Yan Wang of Worcester Polytechnic Institute in Massachusetts.
Using shredded spent batteries, Wang and colleagues extracted the electrodes and dissolved the metals from those battery bits in an acidic solution. By tweaking the solution’s pH, the team removed impurities such as iron and copper and recovered over 90 percent of three key metals: nickel, manganese and cobalt. The recovered metals formed the basis for the team’s cathode material.
In tests of how well batteries maintain their capacity to store energy after repeated use and recharging, batteries with recycled cathodes outperformed ones made with brand-new commercial materials of the same composition. It took 11,600 charging cycles for the batteries with recycled cathodes to lose 30 percent of their initial capacity. That’s about 50 percent better than the respectable 7,600 cycles for the batteries with new cathodes, the team reports October 15 in Joule. Those thousands of extra cycles could translate into years of better battery performance, Wang says.
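For what it's worth, the "about 50 percent" figure checks out against the reported cycle counts:

```python
# Cycle counts as reported in the Joule study (to 70% of
# initial capacity, i.e. a 30% capacity loss).
recycled_cycles = 11_600   # cathodes made from recycled metals
new_cycles = 7_600         # brand-new commercial cathodes
gain = (recycled_cycles - new_cycles) / new_cycles
print(f"{gain:.0%}")       # 53%
```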
The Dawn of Everything Rewrites 40,000 Years of Human History
A new book recasts social evolution as surprisingly varied
Social evolution, from Ice Age hunter-gatherer networks to ancient Egypt’s pyramid-building dynasties and beyond, gets reinterpreted in a new book as a series of flexible systems that didn’t inevitably produce rampant inequality.
By Bruce Bower
Concerns abound about what’s gone wrong in modern societies. Many scholars explain growing gaps between the haves and the have-nots as partly a by-product of living in dense, urban populations. The bigger the crowd, from this perspective, the more we need power brokers to run the show. Societies have scaled up for thousands of years, which has magnified the distance between the wealthy and those left wanting.
In The Dawn of Everything, anthropologist David Graeber and archaeologist David Wengrow challenge the assumption that bigger societies inevitably produce a range of inequalities. Using examples from past societies, the pair also rejects the popular idea that social evolution occurred in stages.
Such stages, according to conventional wisdom, began with humans living in small hunter-gatherer bands where everyone was on equal footing. Then an agricultural revolution about 12,000 years ago fueled population growth and the emergence of tribes, then chiefdoms and eventually bureaucratic states. Or perhaps murderous alpha males dominated ancient hunter-gatherer groups. If so, early states may have represented attempts to corral our selfish, violent natures.
Neither scenario makes sense to Graeber and Wengrow. Their research synthesis — which extends for 526 pages — paints a more hopeful picture of social life over the last 30,000 to 40,000 years. For most of that time, the authors argue, humans have tactically alternated between small and large social setups. Some social systems featured ruling elites, working stiffs and enslaved people. Others emphasized decentralized, collective decision making. Some were run by men, others by women. The big question — one the authors can’t yet answer — is why, after tens of thousands of years of social flexibility, many people today can’t conceive of how society might effectively be reorganized.
Hunter-gatherers have a long history of revamping social systems from one season to the next, the authors write. About a century ago, researchers observed that Indigenous populations in North America and elsewhere often operated in small, mobile groups for part of the year and crystallized into large, sedentary communities the rest of the year. For example, each winter, Canada’s Northwest Coast Kwakiutl hunter-gatherers built wooden structures where nobles ruled over designated commoners and enslaved people, and held banquets called potlatch. In summers, aristocratic courts disbanded, and clans with less formal social ranks fished along the coast.
Many Late Stone Age hunter-gatherers similarly assembled and dismantled social systems on a seasonal basis, evidence gathered over the last few decades suggests. Scattered discoveries of elaborate graves for apparently esteemed individuals (SN: 10/5/17) and huge structures made of stone (SN: 2/11/21), mammoth bones and other material dot Eurasian landscapes. The graves may hold individuals who were accorded special status, at least at times of the year when mobile groups formed large communities and built large structures, the authors speculate. Seasonal gatherings to conduct rituals and feasts probably occurred at the monumental sites. No signs of centralized power, such as palaces or storehouses, accompany those sites.
Social flexibility and experimentation, rather than a revolutionary shift, also characterized ancient transitions to agriculture, Graeber and Wengrow write. Middle Eastern village excavations now indicate that the domestication of cereals and other crops occurred in fits and starts from around 12,000 to 9,000 years ago. Ancient Fertile Crescent communities periodically gave farming a go while still hunting, foraging, fishing and trading. Early cultivators were in no rush to treat tracts of land as private property or to form political systems headed by kings, the authors conclude.
Even in early cities of Mesopotamia and Eurasia around 6,000 years ago (SN: 2/19/20), absolute rule by monarchs did not exist. Collective decisions were made by district councils and citizen assemblies, archaeological evidence suggests. In contrast, authoritarian, violent political systems appeared in the region’s mobile, nonagricultural populations at that time.
Early states formed in piecemeal fashion, the authors argue. These political systems incorporated one or more of three basic elements of domination: violent control of the masses by authorities, bureaucratic management of special knowledge and information, and public demonstrations of rulers’ power and charisma. Egypt’s early rulers more than 4,000 years ago fused violent coercion of their subjects with extensive bureaucratic controls over daily affairs. Classic Maya rulers in Central America 1,100 years ago or more relied on administrators to monitor cosmic events while grounding earthly power in violent control and alliances with other kings.
States can take many forms, though. Graeber and Wengrow point to Bronze Age Minoan society on Crete as an example of a political system run by priestesses who called on citizens to transcend individuality via ecstatic experiences that bound the population together.
What seems to have changed today is that basic social liberties have receded, the authors contend. The freedom to relocate to new kinds of communities, to disobey commands issued by others and to create new social systems or alternate between different ones has become a scarce commodity. Finding ways to reclaim that freedom is a major challenge.
These examples give just a taste of the geographic and historical ground covered by the authors. Shortly after finishing writing the book, Graeber, who died in 2020, tweeted: “My brain feels bruised with numb surprise.” That sense of revelation animates this provocative take on humankind’s social journey.
I’ve ordered this, and after I’ve cleared the 20-or-so books already on my table I’ll read it with my school-trained anthropologist eyes sufficiently dimmed by age to remove any academic taint, and report back…maybe.
Tesla…More than Car Batteries Get Amped Up
The band City Kidd was renamed Tesla during the recording of their first album, 1986's Mechanical Resonance, on the advice of their manager that City Kidd was not a great name (in addition, there was already another band going by that name).
The band derived their name, certain album and song titles, and some song content from events relating to inventor and electrical engineer Nikola Tesla.
Tesla's music is generally categorized as hair/glam metal, heavy metal, and hard rock. The band's first two albums were recorded with a typical 1980s glam metal sound, though Mechanical Resonance has some elements of hard rock and The Great Radio Controversy some elements of blues. Psychotic Supper marked a slight shift toward a bluesier, more acoustic sound, while retaining their traditional pop-metal core.
Heaven’s Trail https://youtu.be/ZvFj-KPik0g?t=1
What You Give https://youtu.be/9vwHuCC6nP8?t=2
While promoting their album The Great Radio Controversy, the band participated in a canned food drive that gave contributors free concert admission; the event was incorporated into the video for "The Way It Is". In February 2005, Tesla headlined a benefit show at the PPAC in Providence, Rhode Island, for the victims of the Station nightclub fire. During the show the band auctioned off an autographed acoustic guitar, with the proceeds going to the Station Family Fund. 100% of the ticket sales also went to this charity.
In February 2008, Tesla helped fund and headlined a benefit concert for victims of the Station nightclub fire. The show was broadcast by VH1 Classic. Tesla played three songs: "What You Give", "Signs", and "Love Song", though "What You Give" did not make it onto the broadcast.
If a Dog Could Talk
Well, Greg Sitek's dog can, and she says the same things.
Quantum Computing and Quantum Supremacy, Explained
IBM and Google are racing to create a truly useful quantum computer. Here's what makes quantum computers different from normal computers and how they could change the world.
Quantum computing could change the world. It could transform medicine, break encryption and revolutionize communications and artificial intelligence. Companies like IBM, Microsoft and Google are racing to build reliable quantum computers. China has invested billions.
Recently, Google claimed that it had achieved quantum supremacy – the first time a quantum computer has outperformed a traditional one. But what is quantum computing? And how does it work?
What is quantum computing?
Let’s start with the basics.
An ordinary computer chip uses bits. These are like tiny switches that can be either in the off position – represented by a zero – or in the on position – represented by a one. Every app you use, website you visit and photograph you take is ultimately made up of millions of these bits in some combination of ones and zeroes.
This works great for most things, but it doesn’t reflect the way the universe actually works. In nature, things aren’t just on or off. They’re uncertain. And even our best supercomputers aren’t very good at dealing with uncertainty. That’s a problem.
That's because, over the last century, physicists have discovered that when you go down to a really small scale, weird things start to happen. They've developed a whole new field of science to try and explain them. It's called quantum mechanics.
Quantum mechanics is the foundation of physics, which underlies chemistry, which is the foundation of biology. So for scientists to accurately simulate any of those things, they need a better way of making calculations that can handle uncertainty. Enter, quantum computers.
How do quantum computers work?
Instead of bits, quantum computers use qubits. Rather than just being on or off, qubits can also be in what’s called ‘superposition’ – where they’re both on and off at the same time, or somewhere on a spectrum between the two.
Take a coin. If you flip it, it can either be heads or tails. But if you spin it – it’s got a chance of landing on heads, and a chance of landing on tails. Until you measure it, by stopping the coin, it can be either. Superposition is like a spinning coin, and it’s one of the things that makes quantum computers so powerful. A qubit allows for uncertainty.
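The spinning-coin picture can be sketched in a few lines of code. This is only a toy probability model of a single qubit (the names and setup here are mine, and this is not how real quantum hardware is programmed):

```python
import random

def measure(alpha: complex, beta: complex) -> int:
    """Collapse a qubit with amplitudes (alpha, beta): return 0 with
    probability |alpha|^2, else 1 -- like stopping the spinning coin."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition, the fair spinning coin:
alpha = beta = 1 / 2 ** 0.5            # |alpha|^2 = |beta|^2 = 0.5
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))     # close to 0.5
```

Until you call `measure`, the qubit carries both possibilities in its amplitudes; measuring forces one definite answer, with odds set by those amplitudes.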
If you ask a normal computer to figure its way out of a maze, it will try every single branch in turn, ruling them all out individually until it finds the right one. A quantum computer can go down every path of the maze at once. It can hold uncertainty in its head.
It’s a bit like keeping a finger in the pages of a choose your own adventure book. If your character dies, you can immediately choose a different path, instead of having to return to the start of the book.
The other thing that qubits can do is called entanglement. Normally, if you flip two coins, the result of one coin toss has no bearing on the result of the other one. They’re independent. In entanglement, two particles are linked together, even if they’re physically separate. If one comes up heads, the other one will also be heads.
It sounds like magic, and physicists still don’t fully understand how or why it works. But in the realm of quantum computing, it means that you can move information around, even if it contains uncertainty. You can take that spinning coin and use it to perform complex calculations. And if you can string together multiple qubits, you can tackle problems that would take our best computers millions of years to solve.
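In the same toy style, entanglement's perfect correlation looks like this. Treat it purely as a cartoon of the "both come up heads" behavior: a real Bell pair cannot in general be explained by shared classical randomness (that is Bell's theorem), which is exactly why entanglement is stranger than this sketch suggests.

```python
import random

def measure_bell_pair() -> tuple[int, int]:
    """Toy Bell pair (|00> + |11>)/sqrt(2): each qubit alone looks
    like a fair coin flip, but the two outcomes always agree."""
    outcome = 0 if random.random() < 0.5 else 1
    return outcome, outcome            # perfectly correlated

pairs = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in pairs))   # True -- always correlated
```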
What can quantum computers do?
Quantum computers aren’t just about doing things faster or more efficiently. They’ll let us do things that we couldn’t even have dreamed of without them. Things that even the best supercomputer just isn’t capable of.
They have the potential to rapidly accelerate the development of artificial intelligence. Google is already using them to improve the software of self-driving cars. They’ll also be vital for modelling chemical reactions.
Right now, supercomputers can only analyze the most basic molecules. But quantum computers operate using the same quantum properties as the molecules they’re trying to simulate. They should have no problem handling even the most complicated reactions.
That could mean more efficient products – from new materials for batteries in electric cars, through to better and cheaper drugs, or vastly improved solar panels. Scientists hope that quantum simulations could even help find a cure for Alzheimer’s.
Quantum computers will find a use anywhere there's a large, uncertain, complicated system that needs to be simulated. That could be anything from predicting the financial markets, to improving weather forecasts, to modelling the behaviour of individual electrons: using quantum computing to understand quantum physics.
Cryptography will be another key application. Right now, a lot of encryption systems rely on the difficulty of breaking down large numbers into prime numbers. This is called factoring, and for classical computers, it’s slow, expensive and impractical. But quantum computers can do it easily. And that could put our data at risk.
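You can feel the classical cost with naive trial division, whose running time grows with the square root of the number being factored. This is a toy sketch only: real attacks on RSA use far better classical algorithms, and Shor's quantum algorithm would beat them all in polynomial time.

```python
def smallest_factor(n: int) -> int:
    """Find the smallest prime factor of n by brute-force trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

print(smallest_factor(2021))  # 43  (2021 = 43 * 47)
# The loop runs about sqrt(n) times, so a 300-digit RSA modulus
# would take on the order of 10**150 steps this way -- hopeless
# on any classical machine.
```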
There are rumors that intelligence agencies across the world are already stockpiling vast amounts of encrypted data in the hope that they’ll soon have access to a quantum computer that can crack it.
The only way to fight back is with quantum encryption. This relies on the uncertainty principle – the idea that you can’t measure something without influencing the result. Quantum encryption keys could not be copied or hacked. They would be completely unbreakable.
When will I get a quantum computer?
You’ll probably never have a quantum chip in your laptop or smartphone. There’s not going to be an iPhone Q. Quantum computers have been theorized about for decades, but the reason it’s taken so long for them to arrive is that they’re incredibly sensitive to interference.
Almost anything can knock a qubit out of the delicate state of superposition. As a result, quantum computers have to be kept isolated from all forms of electrical interference, and chilled down to close to absolute zero. That’s colder than outer space.
They’ll mostly be used by academics and businesses, who will probably access them remotely. It’s already possible to use IBM’s quantum computer via its website – you can even play a card game with it.
But we still have a while to wait before quantum computers can do all the things they promise. Right now, the best quantum computers have about 50 qubits. That’s enough to make them incredibly powerful, because every qubit you add means an exponential increase in processing capacity. But they also have really high error rates, because of those problems with interference.
They’re powerful, but not reliable.
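That "exponential increase in processing capacity" is easy to make concrete: fully describing n qubits classically takes 2**n complex amplitudes. A quick calculation using the article's 50-qubit figure:

```python
# A register of n qubits is described by 2**n complex amplitudes.
for n in (1, 10, 50):
    print(f"{n:>2} qubits -> {2 ** n:,} amplitudes")
# 50 qubits already mean about 10**15 numbers to track, which is
# why each added qubit doubles the cost of classical simulation.
```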
That means that for now, claims of quantum supremacy have to be taken with a pinch of salt. In October 2019, Google published a paper suggesting it had achieved quantum supremacy – the point at which a quantum computer can outperform a classical computer. But its rivals disputed the claim – IBM said Google had not tapped into the full power of modern supercomputers.
Most of the big breakthroughs so far have been in controlled settings, or using problems that we already know the answer to. In any case, reaching quantum supremacy doesn’t mean quantum computers are actually ready to do anything useful.
Researchers have made great progress in developing the algorithms that quantum computers will use. But the devices themselves still need a lot more work.
Quantum computing could change the world – but right now, its future remains uncertain.