Automating the characterization of the visual brightness of Starlink Satellites …and any other satellite

Maybe you have already spotted them yourself in the night sky. Most likely you have heard about them in the news. Since 2019, SpaceX has been launching hundreds and hundreds of Starlink satellites into Earth orbit. In the beginning they were so bright and clearly visible that people all over the world reported this unusual appearance, like in the video by Marco LANGBROEK. No other satellites flew in such train-like formations, so nobody was used to 50 and more bright spots crossing the firmament in a long row. Someone gazing into the night sky would normally see at most a few single satellites at the same time within a human's perceptional field of view. People were used to single satellites above us, and bright pearl chains in the sky drew attention.

Video by Marco LANGBROEK

These formations in the sky caught the attention not only of the public, but also of the scientific community. What is visible and nice-looking for humans is a challenge for any professional observation. Given the sheer number of satellites, many of them end up in photos taken by observatories. What should have been a recording of something in deep space is now crossed by satellite trails, and the picture becomes unusable for science. The next photo shows you the effect on the Blanco telescope's sensor.

Signal pollution in a 333-second exposure image taken from the Blanco four-meter (13′) telescope at the Cerro Tololo Inter-American Observatory. (Source: Wikipedia)

The following quote from 2019 summarizes the situation. Since then, SpaceX has worked to improve matters by cooperating with the scientific community on design changes that reduce the satellites' visual impact on observations.

“The potential tragedy of a mega-constellation like Starlink is that for the rest of humanity it changes how the night sky looks,” says Ronald Drimmel from the Turin Astrophysical Observatory in Italy. “Starlink, and other mega constellations, would ruin the sky for everyone on the planet.”

Ronald Drimmel, Turin Astrophysical Observatory, Italy. From „SpaceX’s Starlink Could Change The Night Sky Forever, And Astronomers Are Not Happy“ (2019-05-27T07:42 EDT)

So this how-to shows you how to measure the apparent brightness of Starlink and other satellites in a simple way. Furthermore, such measurements can help the observation community to see the effects of those design changes.

1. Preparation

To take part in this endeavour, you will need some hardware, some accounts and a bit of software. Before we explain in more detail what to do, here is a condensed list of the things you need for your satellite brightness measurements, sorted by the different aspects.

1.1 Observation camera

You will obviously need something to record your astro-videos. That means your equipment must be able to record videos on which stars and faint satellites can be seen. The listed Sony setup worked flawlessly, and it should also work with other cameras. So please check whether the system you have at hand will work…

  • have a Sony Alpha A7S-i, or any other camera that allows astro-videos in 1080p25
  • use a Sony FE 1.8/50 lens or an equivalent for your camera. You need a good (low) f-number to get a lot of light onto the camera sensor, and the field of view of such a lens suits our purpose
  • have a tripod and ball head for your camera that allow a good view towards the sky and a firm fix.

1.2 Analysis hardware

When you have recorded an astro-video with stars, satellites and everything, you will need some hardware to analyse the video and extract the brightness of all satellites that you observed.
The computer with Python is used for the main process flow of the video analysis. The Raspberry Pi 4 is used only for extracting the stars. We were not able to get astrometry.net to run on a Windows system, so we send each frame to the Raspberry Pi and receive everything we need from there. If you manage to install it on Windows, please tell us. If you have a Linux-only system, you can try to run both parts on one Linux computer.
So far, our setup was like this…

  • have a rather decent laptop or computer with Python for the overall analysis
  • get a spare Raspberry Pi 4 (RAM ≥ 2 GB) that you most likely have lying around; we will use it here for the star extraction
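The laptop-to-Pi hand-off described above can be sketched as a simple send-frame/receive-result round trip. Everything in this sketch is hypothetical: the length-prefixed wire format and the canned JSON reply are illustrations only, not our actual protocol. Both ends run locally here so the example is self-contained:

```python
import json
import socket
import threading

def pi_side(server: socket.socket) -> None:
    """Stand-in for the Raspberry Pi: receive one frame, answer with JSON."""
    conn, _ = server.accept()
    with conn:
        # First 4 bytes announce the frame size, then the frame follows.
        size = int.from_bytes(conn.recv(4), "big")
        data = b""
        while len(data) < size:
            data += conn.recv(4096)
        # A real Pi would run the star extraction here; we return dummy stars.
        conn.sendall(json.dumps({"stars": [[12.5, 41.2]]}).encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))          # any free local port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=pi_side, args=(server,), daemon=True).start()

frame = b"\x00" * 1000                 # placeholder for one frame's PNG bytes
client = socket.create_connection(("127.0.0.1", port))
client.sendall(len(frame).to_bytes(4, "big") + frame)
result = json.loads(client.recv(4096))
client.close()
server.close()
print(result)
```

In the real setup the Pi side does the actual star extraction; here it only returns canned values so the round trip can be demonstrated.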

1.3 Accounts: must-have and nice-to-have

We need a few accounts to be able to work and to report our results back to the community.
Furthermore, there are two optional accounts that we would like you to know about because they could help you.

  • create an account to get the orbit data of satellites
  • subscribe to the SeeSat-L mailing list to discuss and share your results
  • (optional) subscribe to our Zulip chat, where you can ask me anything
  • (optional) have an account to say „thank you“ to the software authors

1.4 Location for satellite spotting

Welcome, satellite spotter! You will need to be outside whenever the weather allows viewing stars and satellites.

  • find a good clear-sky location for your astro-photos/videos and know its geo-coordinates. We used a nice field next to an airfield outside the City of Jena, which fittingly is called the „city of light“. The sky view there is good, there is less light pollution, and it is not that creepy to be outside alone in the dark. Find a location that suits you.

1.5 Optional synchronization hardware

This is part of our lessons learned. You don’t need it, but it makes your life easier: a simple piece of hardware that synchronizes your camera to an external GPS. The Sony A7S-i does not have a built-in GPS, and the manually set date & time proved to be so unreasonably off that finding the real start time took some effort, for each video.
So this shall help to automate it. Keep it in mind when you read until the end.

  • Raspberry Pi Pico (any, we use a W here)
  • Adafruit Ultimate GPS Breakout (any other module that provides a 1 PPS signal should do)
  • prototype soldering board, LED, pre-resistor and soldering stuff
  • PeakDesign dual plate or any other camera adapter that mounts to your tripod

2. What is this good for, or how do I help here?

„Visual Brightness Characteristics of Starlink Generation 1 Satellites“ by Anthony Mallama and Jay Respler

3. How it was done before, manually

„The Method of Visual Satellite Photometry“ by Anthony Mallama
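The manual method referenced above boils down to comparing the satellite's measured flux against a reference star of known magnitude in the same frame. A minimal sketch of that differential-photometry step, with made-up flux values:

```python
import math

def apparent_magnitude(flux_sat: float, flux_ref: float, mag_ref: float) -> float:
    """Differential photometry: magnitude of the satellite relative
    to a reference star of known magnitude in the same frame."""
    return mag_ref - 2.5 * math.log10(flux_sat / flux_ref)

# A satellite appearing half as bright as a magnitude-4.0 star:
print(apparent_magnitude(500.0, 1000.0, 4.0))  # → about 4.75
```

The flux values here stand in for whatever pixel sums your photometry extracts; only the ratio matters.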

4. How it is done here, almost fully automated


4.1 Capture the sky with your camera in 1080p25 video. Crank up your ISO!
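For the analysis, the recorded video has to be split into single frames first. One common way is ffmpeg; the sketch below only builds the command and runs it when ffmpeg and the (placeholder) video file are actually present:

```python
import pathlib
import shutil
import subprocess

out_dir = pathlib.Path("frames")
out_dir.mkdir(exist_ok=True)

# Placeholder file name; substitute your own recording.
cmd = ["ffmpeg", "-i", "satellites.mp4", str(out_dir / "frame_%06d.png")]

# Only run when ffmpeg is installed and the video exists.
if shutil.which("ffmpeg") and pathlib.Path("satellites.mp4").exists():
    subprocess.run(cmd, check=True)
```

Without further options, ffmpeg writes every frame of the 25 fps video as one numbered PNG.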

4.2 Satellite knowledge

–2022-06-21/format/3le (Please log in, or the link does not work! Please use it responsibly!)

Within the above link, there are three sections for us to change according to our observation. So have a look at the box below and at least modify the EPOCH section to match your observation time.

  • NORAD_CAT_ID/>1 — narrow this down to your satellites, or take all as we did
  • EPOCH/FROM-TO — in YYYY-MM-DD format; we used ±3 days around the observation date
  • format/3le — keep the 3-line element format, so you have the names of the satellites
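The ±3 day EPOCH window is easy to compute. A small sketch with a placeholder observation date (the surrounding query syntax stays as in the link above):

```python
from datetime import date, timedelta

obs_date = date(2022, 6, 18)   # placeholder: your observation date
window = timedelta(days=3)     # the ±3 days we used

epoch_from = (obs_date - window).isoformat()
epoch_to = (obs_date + window).isoformat()

print(epoch_from, epoch_to)    # → 2022-06-15 2022-06-21
```

Plug the two dates into the EPOCH section of the query.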

4.3 Celestial position knowledge of your camera’s field of view

Install astrometry.net on your Raspberry Pi 4. That will take a while.

# install on your Raspberry Pi 4
sudo apt-get install -y astrometry.net

When astrometry.net is installed, you need to set it up for your camera. astrometry.net needs so-called „index files“ that provide the settings for different lens sizes. Our Sony FE 1.8/50 has a field of view (FoV) of about 45°, and for this kind of lens the „4100“ index file series is used. Download the index files for a lens with about 45° FoV. If you have a smaller FoV from a tele-lens, then download the other index files, too.
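If you are unsure which index series your lens needs, you can estimate the diagonal field of view from the focal length and the sensor diagonal (about 43.3 mm for the A7S's full-frame sensor). A small sketch:

```python
import math

def fov_deg(sensor_mm: float, focal_mm: float) -> float:
    """Diagonal field of view of a rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# Full-frame sensor (43.3 mm diagonal) with the 50 mm lens:
print(round(fov_deg(43.3, 50.0), 1))  # → 46.8
```

That matches the roughly 45° stated above; pick the index files covering that scale.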

When downloaded, create a local folder where you unpack the files. We placed them in „/home/pi/astrometry/index“. You need to modify the astrometry.cfg config file to point to your folder.

Open the config file:

sudo nano /etc/astrometry.cfg

Find this line and change it like this:

#add_path /usr/share/astrometry
add_path /home/pi/astrometry/index
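Once the index files are configured, you can check the setup by plate-solving a single extracted frame. A minimal sketch (the frame name is a placeholder; the scale flags tell solve-field roughly how wide the image is so it only searches the matching index files):

```python
import shutil
import subprocess

# Placeholder frame name; the scale range brackets our ~45° FoV.
cmd = [
    "solve-field",
    "--scale-units", "degwidth",
    "--scale-low", "30",
    "--scale-high", "50",
    "frame_000001.png",
]

# Only run when astrometry.net's solve-field is actually installed.
if shutil.which("solve-field"):
    subprocess.run(cmd, check=True)
```

On success, solve-field writes the celestial coordinates of the frame next to the input file.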

4.4 Geographical coordinate position knowledge of your observation location
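The analysis needs your location as decimal degrees. If your GPS or map gives degrees/minutes/seconds, a small helper converts them (the coordinates below are rough placeholder values near Jena, not our exact spot):

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float,
                   positive: bool = True) -> float:
    """Convert degrees/minutes/seconds to decimal degrees.

    positive=False flips the sign for south/west coordinates.
    """
    value = degrees + minutes / 60 + seconds / 3600
    return value if positive else -value

# Placeholder coordinates near Jena:
lat = dms_to_decimal(50, 55, 0)
lon = dms_to_decimal(11, 35, 0)
print(lat, lon)
```

Write the decimal values down together with the observation; the satellite predictions depend on them.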

4.5 Time of observation knowledge of your video start, or how to sync your video to reality
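Once the real start time of the video is known, the timestamp of every single frame follows directly from the 25 fps frame rate. A minimal sketch (the start time is a placeholder):

```python
from datetime import datetime, timedelta

def frame_time(video_start: datetime, frame_number: int,
               fps: float = 25.0) -> datetime:
    """Timestamp of a given frame, counted from the video start."""
    return video_start + timedelta(seconds=frame_number / fps)

# Placeholder start time; frame 250 of a 25 fps video is 10 s in:
start = datetime(2022, 6, 18, 22, 30, 0)
print(frame_time(start, 250))  # → 2022-06-18 22:30:10
```

This is why an accurate start time matters: every satellite position in the video is dated relative to it.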

5. Results, look what we found!

6. Conclusions and how to improve this