How to photograph shock waves?


This week NASA released the first-ever image of shock waves interacting between two supersonic aircraft. It’s a stunning effort, requiring a cutting-edge version of a century-old photographic technique and perfect coordination between three airplanes – the two supersonic Air Force T-38s and the NASA B-200 King Air that captured the image. The T-38s are flying in formation, roughly 30 ft apart, and the interaction of their shock waves is distinctly visible. The otherwise straight lines curve sharply near their intersections. 

Fully capturing this kind of behavior in ground-based tests or in computer simulation is incredibly difficult, and engineers will no doubt be studying and comparing every one of these images with those smaller-scale counterparts. NASA developed this system as part of their ongoing project for commercial supersonic technologies. (Image credit: NASA Armstrong; submitted by multiple readers)

How do these images get captured?

It may not be obvious how this image was generated, because if you have heard of Schlieren imaging, the setup you have in your head probably looks something like this:


But how does Schlieren photography scale up to capturing moving objects in the sky?

Heat Haze

When viewing objects through the exhaust gases emanating from an aircraft's nozzle, the image appears distorted.


Hot air is less dense than cold air.

This creates a gradient in the refractive index of the air.

Light passing through this gradient gets bent, distorting the image.
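This chain can be put into rough numbers. Here is a minimal sketch, assuming an ideal-gas density and the Gladstone–Dale relation n − 1 = K·ρ with K ≈ 2.26 × 10⁻⁴ m³/kg for air (the temperatures are illustrative):

```python
# Rough sketch: refractive index of air vs. temperature.
# Assumptions: ideal-gas density; Gladstone-Dale relation n - 1 = K * rho.
P = 101325.0      # pressure, Pa
M = 0.02896       # molar mass of air, kg/mol
R = 8.314         # gas constant, J/(mol K)
K = 2.26e-4       # Gladstone-Dale constant for air, m^3/kg (approximate)

def refractive_index(T):
    rho = P * M / (R * T)     # ideal-gas density, kg/m^3
    return 1 + K * rho

n_cold = refractive_index(293.0)   # ambient air, ~20 C
n_hot = refractive_index(353.0)    # heated air, ~80 C
print(n_cold, n_hot)   # hot air has the lower index, so light bends
```

The difference is tiny (a few parts in 10⁵), but it is enough to visibly distort a background viewed through the heated air.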


Method 01: BOSCO (Background-Oriented Schlieren using Celestial Objects)

You make the aircraft whose shock wave you would like to analyze pass in front of the sun.

You place a hydrogen-alpha filter on your ground-based telescope and observe this:


                  Notice the ripples that pass through the sunspots

The varying air density caused by the aircraft bends this specific wavelength of light from the sun, allowing us to see the density gradient just like in the heat haze above.

We can now calculate how far each “speckle” on the sun moved, and that gives us the following Schlieren image.
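In essence, each speckle's displacement is found by cross-correlating the distorted image with an undistorted reference. A toy one-dimensional sketch (the random "speckle" signal and the shift are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random(200)        # speckled background (sunspots, painted texture)
true_shift = 3                     # apparent displacement caused by the density gradient
distorted = np.roll(reference, true_shift)

# The lag at which the cross-correlation peaks is the measured displacement
corr = np.correlate(distorted, reference, mode="full")
lag = int(np.argmax(corr)) - (len(reference) - 1)
print(lag)   # recovers the shift of 3 samples
```

Real background-oriented schlieren does this in two dimensions, patch by patch, turning a field of tiny speckle shifts into a density-gradient image.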

Method-02: Airborne Background Oriented Schlieren Technique

In the previous technique, imaging relied on how far each speckle of the sun moved. But you can also use any textured background pattern in general.

An aircraft with a camera flies above the target aircraft like so:


The patterned ground now plays the role of the sun. Some commonly used texture patterns are:


The difficulty in this method is the image processing that follows after the images have been taken.

And one of the main reasons the image NASA released is so spectacular is that NASA seems to have nailed the underlying processing involved.

Have a great day!

* More on Heat hazes

** More on BOSCO

*** Images from the following paper : Airborne Application of the Background Oriented Schlieren Technique to a Helicopter in Forward Flight

**** This post obviously oversimplifies the technique. A lot of research goes into the processing of these images. The motive of the post was to give you an idea of the method used to capture the image; the underlying science goes much deeper than this.


Screens, Lasers and Symmetry

A couple of weeks ago, we discussed the famous Photograph 51 and how it led to the discovery of the double-helix structure of DNA.

We mentioned in that post that the best way to visualize that diffraction pattern is by pointing a laser at a helix from a ballpoint pen spring:


And in the previous post on pixels, we learned how the RGB pixels arranged on a screen come together to render those beautiful images.


                                        Source : Microworld

The pixel arrangement on a screen need not be periodic like the one shown above. In fact, most manufacturers have their own unique arrangement (see below), and the type varies with the application as well.


Say you are an amateur physicist with no microscope, only a green laser as your tool. How would you go about finding which one of these arrangements your smartphone has?

Visualizing pixel spacing using a LASER

You know for a fact that:

if you shine a red light on a green or blue object, it will
appear black.



So if you take your green laser pointer and shine it on any of those pixel blocks, you know that you are only going to get green light from the green filter.


The other two filters will absorb the green light.

And using that you can find out the type of pixel arrangement your smartphone has.

We will be testing it out with a Samsung Galaxy S4, whose pixel arrangement looks like so:


Notice the oval nature of the green dots.

Let’s shine a green laser on the screen and observe the resulting diffraction pattern:


The diffraction pattern that you obtain is the following:


Observe that the dots on the image are not circles but ovals instead. This is due to the nature of the pixel arrangement on the Galaxy S4.

If you had a good red laser (which we did not) and tried this same experiment, you would get a pattern like so:


You are also welcome to try it on a smartphone of your choice or any electronic display and compare it with the pixel arrangement of that particular device.


This paper (from which the above image has been taken) runs through some more examples of the diffraction pattern that one obtains from common electronic components.

Have fun!

Related Interesting videos:

LCD Technology: How it Works

How a TV Works in Slow Motion – The Slow Mo Guys

* As with any diffraction pattern, you can measure the distance between two dots and calculate the distance between two consecutive pixels using the given wavelength of the light source.
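That footnote's calculation can be sketched with the grating equation d·sin(θ) = mλ. The screen distance and the measured offset of the first-order dot below are made-up example values, not measurements from the photos above:

```python
import math

wavelength = 532e-9   # green laser, m
L = 1.0               # screen-to-phone distance, m (assumed)
x = 0.012             # measured offset of the first-order dot, m (assumed)

sin_theta = x / math.hypot(x, L)          # sine of the diffraction angle
pixel_pitch = 1 * wavelength / sin_theta  # d = m * lambda / sin(theta), with m = 1
print(pixel_pitch)    # ~4.4e-5 m, i.e. ~44 micrometres between pixels
```

With your own phone, you would plug in your own measured x and L; the spacing of the dots in each direction tells you the pixel spacing in that direction.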

Need for multiple slits

In the previous post we discussed why physicists love to talk about wavelengths and how the wavelength of light can be found easily by measuring the angle (theta) between the highest intensity (zero order) and, say, the 'nth' order in a diffraction grating.


If you have heard of the famous Young's double-slit experiment, a diffraction grating is a similar tool which has N slits instead of two.

But why do we need N slits?

Let’s look at plots of the intensity of light produced at the screen with different slit configurations to gain some perspective.


Notice how the width of the bright spots becomes narrower as we increase the number of slits:


And the following is how the intensity of light (I) changes with each configuration:
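These curves come from the standard N-slit interference formula, I/I_max = (sin(Nφ/2) / (N·sin(φ/2)))², where φ is the phase difference between neighbouring slits. A quick numerical sketch showing that, at the same small offset from the central maximum, more slits means far less light:

```python
import numpy as np

def intensity(phi, N):
    """Normalized N-slit interference intensity I/I_max at phase difference phi."""
    num = np.sin(N * phi / 2)
    den = np.sin(phi / 2)
    with np.errstate(divide="ignore", invalid="ignore"):
        # At phi = 0 (a principal maximum) the ratio tends to N
        ratio = np.where(np.abs(den) < 1e-12, N, num / den)
    return (ratio / N) ** 2

# Same angular offset, increasing slit count -> the peak gets narrower:
for N in (2, 5, 10):
    print(N, float(intensity(0.5, N)))
```

The first zero next to the central peak sits at φ = 2π/N, so the peak width shrinks like 1/N as slits are added.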


Why is it important to have narrow regions of brightness?

When you actually perform a double slit experiment, here’s what you would see IRL:


                                     PC: artemiawatson on Tumblr

Yes, you can see the interference pattern, but it looks awful if you want to measure the distance between two peaks. The whole pattern looks so diffused!

We would like nice, crisp peaks in intensity so that we don't have to work hard to measure the distance between the peak intensities. And we just observed that simply increasing the number of slits seems to get the job done!

So let’s increase the number of slits! A tool that does this is known as a diffraction grating.


                           Diffraction Grating of different slit sizes

Using one of these gratings, we get the following crisp image of the peaks.


We can now take a ruler and measure the distance between the first maximum and the rest. This makes the measurement of the wavelength of light even easier.

Have a good one!

EDIT: If this still feels bizarre and you would like a more visual approach to Diffraction Gratings, we refer you to this MIT video by Shaoul Ezekiel.

Why is wavelength more important than frequency?

Whenever you see physicists talking about light, you might have noticed that they prefer to use the wavelength of the light rather than its frequency.


This is not a slip of the tongue, and there is a very simple reason for it.

It is convenient to measure the wavelength of light experimentally rather than its frequency.

Take violet light of wavelength 400 nm. If we calculate its frequency, it turns out to be:


Why is this a problem?

Can’t we measure 7.5 x 10^14 Hz directly?* There is a theorem by Nyquist in signal processing which states that:

the minimum rate at which a signal can be sampled without introducing
errors is twice the highest frequency present in the signal.

This means that if you want to measure the frequency of light accurately, you need to be sampling at 2 x (7.5 x 10^14) Hz, and this is incredibly hard to achieve instrumentally!
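Plugging in the numbers makes the gap vivid. The "~100 GHz for fast electronics" figure below is a rough order-of-magnitude assumption, not a precise spec:

```python
c = 3.0e8                       # speed of light, m/s
wavelength = 400e-9             # violet light, m

frequency = c / wavelength      # 7.5e14 Hz
nyquist_rate = 2 * frequency    # minimum sampling rate: 1.5e15 Hz

# Even very fast electronics sample around ~1e11 Hz (rough order of magnitude),
# about four orders of magnitude short of what Nyquist demands here:
print(frequency, nyquist_rate, nyquist_rate / 1e11)
```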

Diffraction Grating

On the other hand, here is how easy it is to measure the wavelength of the light:

Take the source of light and pass it through a diffraction grating.


Measure the angle(theta) between the highest intensity (zero order) and say the ‘nth’ order. (see diagram above).

Use the following formula for the wavelength : **


where d is the distance between the slits (provided by the manufacturer of the diffraction grating), n is the order of the maximum, and theta comes from your measurement.

And voila, you have the wavelength of the light. That's how simple it is to get the wavelength of a light source. Since the speed of light is a constant, the frequency of the light is found from the following relation:


In addition to this, you can also derive the energy of a photon using the relation:


And so on and so forth. All of these follow from a simple diffraction experiment! That's why calculating the wavelength of light is so crucial.
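To make the whole chain concrete, here is a sketch for a green laser and a 600 lines/mm grating; the grating spacing and the measured angle are assumed example values:

```python
import math

d = 1e-3 / 600            # slit spacing of a 600 lines/mm grating, m (assumed)
n = 1                     # diffraction order used
theta_deg = 18.61         # measured angle of the first-order maximum (assumed)

# d * sin(theta) = n * lambda  ->  lambda = d * sin(theta) / n
lam = d * math.sin(math.radians(theta_deg)) / n    # ~532 nm: green light

c = 3.0e8                 # speed of light, m/s
h = 6.626e-34             # Planck's constant, J s

f = c / lam               # frequency from c = lambda * f
E = h * f                 # photon energy from E = h * f
print(lam, f, E / 1.602e-19)   # wavelength (m), frequency (Hz), energy (eV)
```

One protractor measurement, and the wavelength, frequency, and photon energy all fall out.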

Have a good one!

* There are some indirect means to do this. Check this physicsforum page for more.

** How do diffraction gratings work ?

Heeeey! How does the laser and chalk dust thingy work? Why does the laser reflect like that? I want to share it with my old physics teacher for her new students.


What is a Colloidal solution?

A homogeneous, noncrystalline substance consisting of large molecules or
ultramicroscopic particles of one substance dispersed through a second substance.

Colloids include gels, sols, and emulsions; the particles do
not settle and cannot be separated out by ordinary filtering or
centrifuging like those in a suspension.

You need the particles in the solution to scatter the light passing
through them in order to see the path taken by the light. This goes
by the name of the Tyndall effect.

It is related to Rayleigh scattering: Rayleigh scattering requires the
light-scattering particles to be far smaller than the wavelength of the light,
whereas Tyndall scattering involves bigger particle sizes.
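One common way to put this distinction into numbers is the dimensionless size parameter x = 2πr/λ: x much less than 1 means the Rayleigh regime, while x around 1 or larger means the Tyndall/Mie regime. The particle radii below are rough illustrative values:

```python
import math

def size_parameter(radius, wavelength):
    """Dimensionless size parameter x = 2*pi*r / lambda."""
    return 2 * math.pi * radius / wavelength

lam = 550e-9                              # green light, m
x_air = size_parameter(0.15e-9, lam)      # air molecule, ~0.15 nm radius (rough)
x_milk = size_parameter(0.5e-6, lam)      # milk fat globule, ~0.5 um radius (rough)

print(x_air)    # ~0.0017 -> Rayleigh regime (think blue sky)
print(x_milk)   # ~5.7    -> Tyndall/Mie regime (the milky glow)
```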

Even if you don’t have a laser, you can witness this phenomenon with a torch and a glass of milk. (Milk is a colloid.)





One can also turn this procedure on its head and use it to find whether a mixture is a true solution or a colloid! Lots of fun!

Thanks for asking !

** More examples of Tyndall effect

Take a glass bowl and fill it with water.

Sprinkle some chalk dust on it.

Perform cool refraction experiments using a laser pointer as the source.

There is nothing like a simple DIY experiment to clearly understand the principles of physics.

Have fun!

Shadows are not always black!

Shadows are absolutely
fascinating to play around with.

In this Exploratorium demonstration
you can see that a black shadow is only a subset of shadows that can be
formed on the screen.


If you have multiple sources of light with different colors, then you can additively combine colors to get shadows of various shades.


In this case, where you have multiple sources of light, you get a black shadow only when the object blocks light from all three colored sources.

Since we do not deal with multiple colored sources of light on a daily basis, go ahead, give this simple experiment a try. It’s totally worth it!

Have a good one!

** Other FYP explorations on Shadows :

Can a single point of light always illuminate an entire room ?

On disappearing shadows of Birds and airplanes


If the sizes of the raindrops are uniformly small* (~0.5 mm), nature treats you to these exotic green, pink, and purple fringes inside the bright primary rainbow, known as supernumerary rainbows.


We were able to observe these rainbows yesterday, but they lasted only
a couple of minutes. The above images have been corrected for
saturation in order to make the fringes more prominent.

What makes these supernumerary rainbows really interesting is that you can't use geometric optics to explain their formation; you are forced to acknowledge the wave nature of light.

To know about the physics behind Supernumerary Rainbows, click here and explore. It’s truly amazing!

* When raindrops have different sizes, the differently spaced fringes overlap and blur out.

Why do pilots use non-polarized sunglasses?


Polarized lenses are not recommended for use in aviation.

While useful for blocking reflected light from horizontal
surfaces such as water or snow,


polarization can reduce or eliminate the
visibility of instruments that incorporate anti-glare filters.


Polarized lenses may also interfere with visibility through an aircraft
windscreen by enhancing striations in laminated materials (known as photoelasticity)


     Photoelastic visualization of contact stresses on a marble in a C-clamp.


and mask the
sparkle of light that reflects off shiny surfaces such as another
aircraft’s wing or windscreen, which can reduce the time a pilot has to
react in a “see-and-avoid” traffic situation.


* Source: Polarized vs. non-polarized cockpit images

To light a candle is to cast a shadow,

Shadows are fascinating to all living creatures; the subtle fear of and curiosity about an entity that always follows you everywhere you go makes shadows extremely thrilling to explore.

How are shadows formed?

A shadow is just a region that rays of light are unable to enter.


                             Relative sizes of the planets and the sun

When one pictures the sun, one might be tempted to think (from its small apparent size in our skies) that it is a point source of light, but the sun is HUGE!

If it were only a point source of light, one would never get the familiar umbras and penumbras that we see during an eclipse.



In addition to umbras and penumbras, there is another classification of these shadows: antumbras.


If you were standing in the antumbral region, you would experience an annular eclipse, in which a bright ring is visible around the eclipsing body.
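A quick similar-triangles sketch shows why annular eclipses happen at all: the Moon's umbra is only barely long enough to reach Earth. The astronomical values below are rounded:

```python
R_sun = 6.957e8        # radius of the sun, m
r_moon = 1.737e6       # radius of the moon, m
D = 1.496e11           # sun-moon distance, m (~1 AU, rounded)

# Similar triangles: the umbra cone behind the moon has length
# l = D * r / (R_sun - r)
umbra_length = D * r_moon / (R_sun - r_moon)   # ~3.74e8 m

perigee, apogee = 3.633e8, 4.055e8             # moon-earth distance range, m
print(umbra_length > perigee)   # True: umbra can reach Earth -> total eclipse
print(umbra_length < apogee)    # True: umbra can fall short -> annular eclipse
```

The umbra tip lands right inside the range of Moon–Earth distances, which is why we get total eclipses near perigee and annular ones near apogee.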

This is NOT the ‘shadow’ of the ISS



                                Photo credit: NASA/Joel Kowsky

As you can see from the ray diagram, the ISS DOES cast a shadow, and if you were on the ISS you would most certainly see it (orange box – umbra region).


What you are witnessing from Earth in that amazing ISS photobomb of the 2017 solar eclipse is NOT a shadow but just the outline of the ISS against the surface of the sun: an annular eclipse of the ISS.

On disappearing shadows of Birds and airplanes

The shadows of airplanes and birds close to the ground are definitely a common sight.



But it seems that when these birds and airplanes are flying way up in the sky, they cast no shadows on the ground!

Although it is incorrect to say that they don't cast shadows at all, it is true that their shadows never reach the ground. Here's an illustration:



How does one place a screen near an object flying 30,000 ft above the ground to witness its shadow?

Well, the clouds in the sky can most certainly act as screens to capture the shadow of an airplane when it's flying high up in the sky. (This is an optical phenomenon known as a 'glory'.)


Shadows are just breathtaking, aren’t they?