Solar flares are violent explosions on the sun that fling out high-energy charged particles, sometimes toward Earth, where they disrupt communications and endanger satellites and astronauts.
But as scientists discovered in 1996, flares can also create seismic activity — sunquakes — releasing impulsive acoustic waves that penetrate deep into the sun’s interior.
While the relationship between solar flares and sunquakes is still a mystery, new findings suggest that these “acoustic transients” — and the surface ripples they generate — can tell us a lot about flares and may someday help us forecast their size and severity.
A team of physicists from the United States, Colombia and Australia has found that part of the acoustic energy released by a flare in 2011 emanated from about 1,000 kilometers beneath the solar surface (the photosphere), and thus from well below the flare that triggered the quake.
TV news broadcasters, production studios, and others are using technology from a Seattle startup to analyze their video content and make changes based on audience reception.
Resonance AI recently reeled in $2.28 million of a larger investment round to fuel growth of its video analysis platform that uses artificial intelligence to measure dialogue, music, mood, lighting, pacing, movement, and more.
The startup provides data that helps content creators figure out what resonates with an audience. It can answer questions such as: Who is my most valuable talent? What types of stories resonate by market? Is the editing of my show too fast-paced?
The six-year-old startup is led by CEO and co-founder Tom Chiarella, a former exec at Statera and General Electric, as well as president and co-founder Randa Minkarah, a