Optical satellite detection with Python, or how to find all SpaceX #Transporter2 mission satellites with a Sony A7S camera

This is the next step in the tutorial series that started with „Can a GoPro camera be used to optically track satellites?“. Since we found out that the GoPro cannot be controlled from a PC with gPhoto or similar software, the focus shifted to other cameras. Luckily, a community member on the SeeSat mailing list posted a video of the SpaceX Transporter2 mission that launched on the 30th of June 2021. He recorded it with a Sony Alpha A7S (first mark) and, without any exaggeration, it was a great video showing many of the 88 satellites that the SpaceX Falcon 9 released.

This new tutorial shows what you can do with such low-light astrophoto videos. It shows how simple Digital Image Processing (DIP) functions help you detect star-like objects such as satellites, and how slow-moving stars and bright clouds are compensated for so that faster-moving objects can be detected.

Basically it should work with other cameras as well, but here it was only tested with the Sony A7S, which is known for its low-light capabilities. Unfortunately, the A7S II and III are said to eat stars. So if you test it with your camera, please leave a comment here.

So, let’s start with what we did and let’s find the many satellites of the SpaceX Transporter2 mission that were visible in the night of the 1st of July near London, UK…

  • Astrophoto video
  • Pre-Processing
  • Python code
    • Difference of two consecutive frames
    • Keeping the image parts over a threshold
    • Closing the holes in the pixel clusters
    • Dilation and erosion of the pixel clusters
    • Clustering the pixel clusters
    • Finding star-like objects at the locations of the clusters
    • Finding the nearest cluster in the next and next-next frame
    • Detecting objects that are found in 3 frames with the expected speed and direction
  • Single photos to video

0. Astrophoto video from the night sky

1. Pre-Processing

The original video is pre-processed with FFMPEG to stretch the contrast. This is done because the star-like objects we are looking for are faint. For that we used the „colorlevels“ filter, and the best results were obtained with max levels of 50% (0.5) for each RGB color channel.

We used the Windows build of FFMPEG from https://ffmpeg.org/download.html#build-windows but the Linux versions will do the same.

ffmpeg.exe -ss 00:00:00 -i VIDEOFILE.MP4 -crf 25 -pix_fmt yuv420p -vf "colorlevels=rimax=0.50:gimax=0.50:bimax=0.50" VIDEOOUT.MP4

2. Python code

https://github.com/aerospaceresearch/satobserve
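
The full pipeline lives in the satobserve repository linked above. To illustrate the processing steps listed in the overview, here is a minimal sketch using OpenCV and NumPy; the video file name, threshold value, kernel size and minimum cluster area are illustrative assumptions and may differ from what the repository actually uses.

import cv2
import numpy as np

def detect_clusters(prev_gray, curr_gray, thresh_value=25):
    # 1. Difference of two consecutive frames: static stars and slowly
    #    changing clouds largely cancel out, while moving objects remain.
    diff = cv2.absdiff(curr_gray, prev_gray)

    # 2. Keep only the image parts over a threshold.
    _, mask = cv2.threshold(diff, thresh_value, 255, cv2.THRESH_BINARY)

    # 3. Close the holes in the pixel clusters.
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # 4. Dilation and erosion to merge nearby pixels and drop single-pixel noise.
    mask = cv2.dilate(mask, kernel, iterations=2)
    mask = cv2.erode(mask, kernel, iterations=1)

    # 5. Cluster the remaining pixels into connected components and return
    #    their centroids as candidate detections. (The step of checking for
    #    star-like objects at the cluster locations is not shown here.)
    num_labels, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    detections = []
    for label in range(1, num_labels):            # label 0 is the background
        if stats[label, cv2.CC_STAT_AREA] >= 2:   # ignore one-pixel clusters
            detections.append(tuple(centroids[label]))
    return detections

cap = cv2.VideoCapture("VIDEOOUT.MP4")            # the contrast-stretched video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
frame_index = 1
while True:
    ok, frame = cap.read()
    if not ok:
        break
    curr_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print(frame_index, detect_clusters(prev_gray, curr_gray))
    prev_gray = curr_gray
    frame_index += 1
cap.release()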

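The last two steps, finding the nearest cluster in the next and next-next frame and keeping only objects that are found in 3 frames with the expected speed and direction, could then look roughly like this; the maximum step size and the tolerance are again illustrative assumptions.

import math

def confirm_tracks(det0, det1, det2, max_step=30.0, tolerance=5.0):
    # Keep only candidates that appear in three consecutive frames and move
    # with roughly constant speed and direction.
    confirmed = []
    for p0 in det0:
        # Nearest cluster in the next frame.
        p1 = min(det1, key=lambda p: math.dist(p0, p), default=None)
        if p1 is None or math.dist(p0, p1) > max_step:
            continue
        # Extrapolate the motion and look for the nearest cluster
        # in the next-next frame.
        expected = (2 * p1[0] - p0[0], 2 * p1[1] - p0[1])
        p2 = min(det2, key=lambda p: math.dist(expected, p), default=None)
        if p2 is not None and math.dist(expected, p2) <= tolerance:
            confirmed.append((p0, p1, p2))
    return confirmed

Called with the detections of three consecutive frames, e.g. confirm_tracks(detections[i], detections[i+1], detections[i+2]), it returns the candidate tracks that survive the speed and direction check.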
3. Single photos to video

As an extra step, the single images are bundled into a video, again with FFMPEG.

ffmpeg.exe -r 60 -f image2 -s 1920x1080 -start_number 6764 -i box_%d.jpg -vcodec libx264 -crf 15  -pix_fmt yuv420p test1.mp4