
02 May 2025
Tags: solex jsolex solar astronomy
I’m happy to announce the release of JSol’Ex 3.1, which ships with a long-awaited feature: jagged edges correction! Let’s explore in this article what this is about.
Spectroheliographs like the Sol’Ex or the Sunscan do not use a traditional imaging system like, for example, in planetary imaging, where you can capture dozens to hundreds of frames per second and use so-called "lucky imaging" to select the best frames and stack them into a high-resolution image.
In the case of a spectroheliograph, the image is built by scanning the solar disk in a series of "slices" of the sun: it takes several seconds and sometimes minutes (~3 minutes when you let the sun pass "naturally" through the slit) to get a full image of the sun.
In practice, this means that between each frame, each "slice" of the sun, the atmosphere will have slightly moved, causing some misalignment between the frames. This is also particularly visible when there is some wind, which can cause the telescope to shake a bit, and the image to be misaligned. Lastly, you may even have a mount which is not perfectly balanced, or which has some resonance at certain scan speeds.
As an illustration, let’s take this image captured using a Sunscan (courtesy of Oscar Canales):
This image shows 3 problems:
the jagged edges, which cause some unpleasant "spikes" on the edges of the sun
misalignment of features of the sun, particularly visible on filaments
a disk which isn’t perfectly round
These issues are typical of spectroheliographs, and are the main limiting factor when it comes to achieving high resolution images. Therefore, excellent seeing conditions are a must to get high quality images. Even if you do stacking, the fact that the reference image will show spikes is often a problem.
Starting with release 3.1.0, JSol’Ex ships with an experimental feature to correct jagged edges. It is not perfect yet, but good enough for you to provide feedback and even improve the quality of your images.
For example, here’s the same image, but with jagged edges correction applied:
And so that it’s even easier to see the difference, here’s a blinking animation of the two images:
The jagged edges are now mostly gone, the features on the sun are better aligned, and the image is much more pleasant to look at. Some jagging is still visible, and the correction will never be perfect, but it is a good start.
In particular, you should be careful when applying the correction, because it could cause some artifacts in the image, especially on prominences. As usual, with great power comes great responsibility!
To illustrate how the correction works, let’s imagine a perfect scan: a scan speed giving us a perfectly circular disk, no turbulence, no wind, etc.
In this case, what we would see during the scan is a spectrum whose width slowly increases, reaches a maximum, and then decreases. The pace at which the width increases and decreases is determined by the scan speed and is predictable. In particular, the left and right borders of the spectrum will follow a circular curve.
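To make this concrete, here is a minimal Python sketch (my own simplification, not JSol’Ex’s actual code) of the expected slice width for such an ideal scan producing a perfectly circular disk:

```python
import math

def expected_width(frame, total_frames, diameter_px):
    """Width of the solar slice seen at a given frame, for an ideal
    scan producing a perfectly circular disk (chord of a circle)."""
    radius = diameter_px / 2
    # distance of this slice from the disk centre, in pixels
    d = abs(frame - total_frames / 2) * (diameter_px / total_frames)
    if d >= radius:
        return 0.0  # slice falls outside the disk
    return 2 * math.sqrt(radius * radius - d * d)
```

The width is maximal at mid-scan and falls off following the circular curve described above.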
Now, let’s get back to a "real world" scan. In that case, the left and right edges will follow the path of an ellipse rather than a circle, and will slightly deviate from that ideal curve: in fact, this ellipse is already computed, since it is required in order to perform geometric correction.
The idea is therefore quite simple in theory: we need to detect the left and right edges of the spectrum, then compare them to the ideal ellipse that we have computed. Pixels which deviate from this curve give us information about the jagged edges. We can then compute a distortion map, which is used to correct the image.
In practice, we also need to filter the samples: while the edge detection is robust enough to provide a good geometric correction, it is not perfect, and it can be skewed by the presence of prominences, for example. Therefore, we perform sigma clipping on the detected edges in order to remove outliers, that is to say pixels which deviate too much from the average deviation.
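As a sketch of this filtering step (a simplified stand-in for the actual implementation, with my own function name), iterative sigma clipping could look like this:

```python
import statistics

def sigma_clip(deviations, sigma=2.5, max_iter=5):
    """Iteratively discard edge samples whose deviation from the ideal
    ellipse is an outlier (e.g. a prominence crossing the limb)."""
    kept = list(deviations)
    for _ in range(max_iter):
        mean = statistics.fmean(kept)
        stdev = statistics.pstdev(kept)
        if stdev == 0:
            break
        filtered = [d for d in kept if abs(d - mean) <= sigma * stdev]
        if len(filtered) == len(kept):
            break  # converged, no more outliers
        kept = filtered
    return kept
```

A smaller sigma factor removes more samples, which is exactly the trade-off mentioned below regarding artifacts on prominences.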
This is also why the correction will not work properly if the image is not focused correctly: you would combine two problems in one, and the correction would not be able to detect the edges properly.
In addition, in the image above you can see that the bottommost prominence is slightly distorted, which is caused by the fact that it’s far away from the 2 points which were used to compute the distortion. It may be possible to reduce such artifacts by using a smaller sigma factor (at the risk of undercorrecting the edges).
In this blog post, I have described the new jagged edges correction feature in JSol’Ex 3.1. This solves one of the most common issues users are having with spectroheliographs, and I hope it will help you get better images. However, as usual, it’s a work in progress, so do not hesitate to provide feedback!
14 April 2025
Tags: solex jsolex solar astronomy
After dozens of hours of work, I’m happy to announce the release of JSol’Ex 3.0.0! This major release is a new milestone in the development of JSol’Ex, and it brings new features and improvements that I hope you will enjoy.
Since its inception as an educational project for understanding how the Sol’Ex works, JSol’Ex has grown into a powerful tool for processing and analyzing images captured with the Sol’Ex. However, it became very popular over time and started to be used beyond the Sol’Ex community alone. In particular, it is now a tool of choice for many spectroheliograph owners.
I have always been keen on providing a user-friendly interface while keeping a good innovation pace. JSol’Ex was the first SHG software to offer:
automatic colorization of images
automatic detection of spectral lines
Doppler eclipse image, inverted image and orientation grid
automatic correction of the P angle
single click processing of Helium line images
embedded stacking
automatic trimming and compression of SER files
identifying what frame of a SER file matches a particular point of the solar disk
an optimal exposure calculator
automatic detection of redshifts
automatic detection and annotation of sunspots
automatic creation of animations of a single image taken at different wavelengths
a full-fledged scripting engine which allows creation of custom images, animations, etc.
support for home-made SHGs
and more!
All integrated into a single, easy-to-use, cross-platform application: no need for Gimp, ImPPG or AutoStakkert! (but you can use them if you want to!).
For this new release, I wondered if I should change the name so that it better matches the new scope of the project, but I eventually decided to keep it as it is: it is already well known in the community, and changing it would imply a significant amount of time spent on the rename that wouldn’t go into new features.
In addition to performance improvements and bugfixes, this release deserves its major version number because of many significant improvements.
The first thing you may notice is the improved image quality. The algorithm to detect spectral lines has been improved, which results in a more accurate polynomial fit and therefore a more accurate image reconstruction. This is most noticeable in images with low signal, which is often the case in calcium.
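To give an idea of what this polynomial fit involves (my own minimal illustration, not the actual JSol’Ex code), a second-order least-squares fit of the detected line positions across the slit can be written with the normal equations:

```python
def fit_parabola(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via normal equations.
    A parabola is a common model for the slight curvature of a
    spectral line across the slit."""
    n = len(xs)
    s1 = sum(xs); s2 = sum(x * x for x in xs)
    s3 = sum(x ** 3 for x in xs); s4 = sum(x ** 4 for x in xs)
    t0 = sum(ys)
    t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))

    def det3(m):
        # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # solve the 3x3 normal system with Cramer's rule
    d = det3([[s4, s3, s2], [s3, s2, s1], [s2, s1, n]])
    a = det3([[t2, s3, s2], [t1, s2, s1], [t0, s1, n]]) / d
    b = det3([[s4, t2, s2], [s3, t1, s1], [s2, t0, n]]) / d
    c = det3([[s4, s3, t2], [s3, s2, t1], [s2, s1, t0]]) / d
    return a, b, c
```

A more robust detection of the line positions directly translates into a better fit, hence the reconstruction improvement.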
Next, a new background removal algorithm has been added. It is fairly common to have either internal reflections or light leaks in the optical path of a spectroheliograph. This results in images which are hard to process or not usable at all. This version of JSol’Ex is capable of removing difficult gradients. To illustrate this, here’s an image that a user with a Sunscan sent me:
The image on the left is unprocessed and shows important internal reflections. These are completely removed in the image on the right, processed automatically with JSol’Ex.
This background removal will only be applied to the "Autostretch" image, which is the default "enhanced" image that JSol’Ex is using, but it is also available as a standalone function in ImageMath scripts.
Another common issue with SHGs is the presence of vignetting, visible on the poles of the solar disk. The vignetting issue stems from the following factors, in the order of their impact:
the physical size of the SHG’s optical components — including the lens diameter, grating size, and slit length
the telescope’s focal ratio and focal length
the telescope’s own intrinsic vignetting (though this is rarely a significant factor)
For prebuilt SHGs like the MLAstro SHG 700, the size of the lens and grating is typically constrained by the housing design and cost limitations. As a result, vignetting often becomes an issue when using longer focal length telescopes, especially when paired with a longer slit.
To fix this, JSol’Ex has until now offered an artificial flat correction: the idea was basically to model the illumination of the solar disk with a polynomial and apply a correction to the image. This works relatively well, but it can sometimes introduce noise, or even bias the reconstruction on low-contrast images. With even longer slits, this artificial correction is not sufficient to remove the vignetting, so JSol’Ex 3 introduces the ability to use a physical flat correction.
The idea with a physical flat correction is to take a series of 10 to 20 images of the sun, using a light diffuser device at the entrance of the telescope, such as tracing paper, in order to diffuse light. The flat should be captured with the same cropping window as the one used for the solar images, but exposure will be longer, and possibly higher gain as well. The result is a SER file that JSol’Ex can use to create a model of the illumination of the disk, which can be used to correct the images.
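As a rough illustration of the principle (not the actual JSol’Ex algorithm, and with my own function names), a flat model can be built by median-combining the frames per pixel and normalising it so that dividing by it preserves overall brightness:

```python
import statistics

def build_flat_model(flat_frames):
    """Per-pixel median over the flat frames, normalised to mean 1.0
    so that dividing by the model preserves overall brightness."""
    rows, cols = len(flat_frames[0]), len(flat_frames[0][0])
    model = [[statistics.median(f[y][x] for f in flat_frames)
              for x in range(cols)] for y in range(rows)]
    mean = sum(sum(row) for row in model) / (rows * cols)
    return [[v / mean for v in row] for row in model]

def apply_flat(image, model, eps=1e-6):
    """Divide the image by the illumination model, pixel by pixel."""
    return [[px / max(m, eps) for px, m in zip(irow, mrow)]
            for irow, mrow in zip(image, model)]
```

Median-combining several frames rejects transient defects, which is why a series of 10 to 20 images is preferable to a single one.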
As an illustration, here’s a series of 3 images of the Sun, taken with a prototype of a 10mm slit:
The image on the left was produced without any correction and shows very strong vignetting. The image in the middle uses the artificial flat correction, which improves the situation but still shows some vignetting. The image on the right uses the physical flat correction, which is much better and shows no vignetting at all.
Flats can be reused between sessions, as long as you use the same cropping window and the same wavelength.
The physical flat correction can also be used on images taken with a Sol’Ex, in particular for some wavelengths like H-beta which show stronger illumination of the middle of the solar disk.
Note: flat correction is not designed to fix transversalliums: it has to apply low-pass filtering to the image to compute a good flat, which removes the transverse lines. To correct transversalliums, use the banding correction parameters.
By default, JSol’Ex used to display images with a linear stretch applied. Starting with this version, it is possible to select which stretching algorithm to use: linear, curve, or no stretching at all.
This version introduces a new tool to measure distances! This feature was suggested by Minh Nguyen from MLAstro, after seeing one of my images in Calcium H, which showed a very long filament:
This tool lets you click on waypoints to follow a path and make measurements, either on the disk, in which case the distances take the curvature into account, or outside the disk, for example to measure the size of prominences, in which case the distances are linear.
The measured distances are always an approximation, because it’s basically impossible to know at what height a particular feature is located, but it gives a good rough estimate.
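To see why curvature matters, here is a hedged sketch (my own, not JSol’Ex’s code): assuming an orthographic projection of the solar sphere onto the disk, an on-disk distance can be estimated by back-projecting the two points onto the sphere and measuring the arc between them.

```python
import math

SOLAR_RADIUS_KM = 696_000  # photospheric radius of the Sun

def on_disk_distance(p1, p2, center, disk_radius_px):
    """Arc length in km between two points on the visible disk,
    assuming an orthographic projection of the solar sphere."""
    def to_sphere(p):
        # normalise pixel coordinates to the unit sphere
        x = (p[0] - center[0]) / disk_radius_px
        y = (p[1] - center[1]) / disk_radius_px
        z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
        return (x, y, z)
    a, b = to_sphere(p1), to_sphere(p2)
    dot = max(-1.0, min(1.0, sum(u * v for u, v in zip(a, b))))
    return SOLAR_RADIUS_KM * math.acos(dot)
```

Near the limb, the same distance in pixels corresponds to a much longer arc on the sphere, which is why a flat, linear measurement would underestimate it.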
Last but not least, this version significantly improves the scripting engine, aka ImageMath. While this feature is for more advanced users, it is an extremely powerful tool which lets you generate custom images, automatically stack images, create animations, etc.
In this version, the scripting engine has been rewritten to make it more enjoyable to use. It adds:
the ability to write expressions on several lines
the possibility to use named parameters
the ability to define your own functions
the ability to call an external web service to generate script snippets
the ability to import scripts into other scripts
It also adds new functions. Let’s take a deeper look.
You may have faced the situation where you wanted to apply the same operation to several images. For example, let’s imagine that you want to decorate an image with the observation details and the solar parameters.
Before, you would write something like this:
image1=draw_solar_params(draw_obs_details(some_image))
image2=draw_solar_params(draw_obs_details(some_other_image))
Now, you can define a function, let’s call it decorate, which will take an image and return the decorated image:
[fun:decorate img]
result = draw_solar_params(draw_obs_details(img))
[outputs]
image1=decorate(some_image)
image2=decorate(some_other_image)
You can take a look at the documentation for more details.
In the previous section we have seen how to define functions. It can be useful to externalize these functions in a separate file, so that they can be reused in other scripts. This is now possible with the import statement. For example, let’s say you have a file called utils.math which contains the decorate function. We can now import this file in our script:
[include "utils"]
[outputs]
image1=decorate(some_image)
image2=decorate(some_other_image)
This will import the utils.math file and make the decorate function available in the current script.
Named parameters are a new feature that allows you to pass parameters to functions by name, instead of by position. This is particularly useful for functions that take a lot of parameters, or when you want to make your code more readable.
For example, in the example above, we could have written:
[include "utils"]
[outputs]
image1=decorate(img: some_image)
image2=decorate(img: some_other_image)
The names of the parameters are documented here.
This version introduces a few new functions, which are available in the scripting engine:
bg_model: background sky modeling
a2px and px2a: conversion between pixels and Angstroms
wavelen: returns the wavelength of an image, based on its pixel shift, dispersion, and reference wavelength
remote_scriptgen: allows calling an external web service to generate a script or images
transition: creates a transition between two or more images
curve_transform: applies a transformation to the image based on a curve
equalize: equalizes the histograms of a series of images so that they look similar in brightness and contrast
And others have been improved:
find_shift: added an optional parameter for the reference wavelength
continuum: improved function reliability, enhancing Helium line extraction
The transition function, for example, is capable of generating intermediate frames in an animation based on the actual time difference between two images, offering smooth, uniform transitions between images.
This is how my partial solar eclipse animation was created!
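Conceptually, this time-proportional interpolation can be sketched like this (a simplification with my own function name, not the actual JSol’Ex code; frames are flattened pixel lists):

```python
def interpolate_frames(img_a, img_b, gap_seconds, seconds_per_frame):
    """Generate intermediate frames between two images, with a count
    proportional to the real time gap, using a linear cross-fade."""
    n = max(0, round(gap_seconds / seconds_per_frame) - 1)
    frames = []
    for i in range(1, n + 1):
        w = i / (n + 1)  # blend weight, 0 -> img_a, 1 -> img_b
        frames.append([(1 - w) * a + w * b
                       for a, b in zip(img_a, img_b)])
    return frames
```

Because the number of frames scales with the real time gap, irregularly spaced scans still produce an animation which advances at a uniform apparent speed.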
I would like to thank all the users who have contributed to this release by reporting bugs, suggesting features, and testing the software. In particular, I would like to recognize the following people:
Minh Nguyen, MLAstro’s founder, for his help with the background removal and flat correction algorithms, as well as the new distance measurement tool, and for reviewing this blog post
Yves Robin for his testing and improvement ideas
my wife for her patience, while I was going to bed late every night to work on this release
30 March 2025
Tags: solex jsolex solar astronomy
In this blog post I’m describing what is probably a world premiere (let me know if not!): capturing a partial solar eclipse using a spectroheliograph and making an animation which covers the whole event.
Edit: it turns out Olivier Aguerre did something similar, described here (in French).
On March 29, 2025, we were lucky to get a partial solar eclipse visible in France, with a maximum of about 25%. I wanted to do what was a first for me, capturing the event, so I used a TS-Optics 80mm refractor with a 560mm focal length, equipped with an MLAstro SHG 700 spectroheliograph. The Astro Club Challandais was organizing a group observation this morning, but my initial decision was not to attend. Instead, I opted to limit the risks by performing this somewhat complex setup at home, in familiar territory. Unlike the SUNSCAN, using a spectroheliograph like the SHG 700 requires more equipment: a telescope, a mount (AZ-EQ6), and in my case, a mini PC for data acquisition (running Windows) along with a laptop for remote connection to the PC.
I have conducted observations away from home before, but from experience, setting everything up—including WiFi, polar alignment, etc.—can be a bit too risky for an event like this. So, all week, I anxiously monitored the weather. Yesterday, the forecast looked grim, with thick clouds and rain. However, the gods of astronomy were merciful, blessing us with a beautiful day. The sky wasn’t entirely clear, but it was good enough for observations.
First of all, I had a specific observation protocol in mind. As you may know, a spectroheliograph doesn’t directly produce an image—it requires software to process video scans of the Sun. In the case of the Sunscan, the software is built into the device, but for a Sol’Ex-type setup, an independent software handles this task. You are probably familiar with INTI, but I have my own software: JSol’Ex.
The advantage of developing my own software is that I was able to anticipate potential issues. One major challenge with an eclipse is that the software must "recognize" the Sun’s outline, which won’t always be perfectly round—in fact, it could be quite elliptical. The software corrects the image by detecting the edges, but when the Moon moves in front, the sampling points become completely incorrect, sometimes detecting the lunar limb instead of the Sun’s!
My strategy was to start early enough to adjust settings for minimal camera tilt and, more importantly, to ensure an X/Y ratio of 1.0. With these reference scans, I could then force all subsequent scans to use the same parameters. So, I began my first scans under a beautifully clear sky! After some adjustments, I was ready: a single scan, and we were good to go!
At the same time, I activated JSol’Ex’s integrated web server and set up a tunnel so my friends could watch my observations live! I planned to perform a scan every two minutes using a Python script in SharpCap, automating the recording, scan start, stop, and rewind. JSol’Ex’s "continuous" mode processed scans in real time. Everything was going smoothly… until panic struck—clouds!
For the past three hours, the sky had been perfectly clear. Yet, just ten minutes before the eclipse began, clouds started rolling in. What bad luck! Fortunately, by spacing out the scans, I managed to capture many of them in cloud-free moments.
The eclipse began, and the first scans featuring the Moon appeared. It worked! Forcing the X/Y ratio was effective!
As the scans piled up, I encountered a new problem. While locking the X/Y ratio helped, the software still needed to calculate an ellipse to determine the Sun’s center for cropping. But things started going wrong—the software was miscalculating everything. I had anticipated this, and I already had a workaround in mind, but the necessary code wasn’t deployed on my mini PC. So, I didn’t worry too much and simply shared the raw images, which were perfectly round—because, as you recall, I had already adjusted my X/Y ratio to 1.0!
I continued scanning, though my setup wasn’t perfectly precise. My polar alignment wasn’t flawless, and I had no millimeter-accurate return-to-start positioning. As a result, I had to manually realign between each scan. While this wasn’t a major issue, it did create some intriguing scans. For those familiar with Sol’Ex, seeing "gaps" in the spectrum due to the Moon’s presence was quite unusual, making centering more difficult.
Time passed, scans continued, and finally, we reached the maximum eclipse phase!
At one point, I wondered whether we could see lunar relief in the images. However, given the jagged edges typical of SHG imaging, it was hard to say for sure, but I think some of the details visible below are actual surface details.
By the end of the eclipse, I had 80 SER files, taken between 9:37 AM and 1:43 PM, totaling nearly 175 GB of data! It was time to transfer everything to my desktop PC to create an animation. This also gave me a chance to test whether my earlier workarounds for ellipse detection would function as expected. I ran batch processing, and boom—within minutes, I had this animation:
And here’s the continuum version, which was weirdly compressed by YouTube:
This was just a first draft, using a beta version of my software. A few hours later I released a new version of the animation which is visible below:
In the end, the experiment was a success! I also took this opportunity to improve my software, which will benefit everyone. If you have eclipse scans, don’t discard them! Soon, you’ll be able to process them too.
The big question now is: Could this be done during a total solar eclipse, such as next year’s in Spain? Well, I feel lucky that this one was only 25% partial. Managing ellipse detection and mount realignment between scans is already quite tricky. During a total eclipse, there wouldn’t even be a reference point!
Unless one has flawless alignment, a mount capable of returning to position perfectly, and a steady scanning speed, this would be a real challenge. Honestly, it’s beyond my current expertise—it would require a lot more work.
P.S.: For French-speaking readers in the west of France (or simply if you are nearby on that date), we are organizing the Rencontres Solaires de Vendée on June 7, where we can discuss this topic!
11 March 2025
Tags: solex jsolex solar astronomy
For a few months, I’ve been working with Minh, from MLAstro, to improve support for the SHG 700 in JSol’Ex. While JSol’Ex was initially designed for Christian Buil’s Sol’Ex, it appears that lots of users use it to process images acquired with a different spectroheliograph. A while back, I added the ability to declare the specifications of the spectroheliograph you are using, which makes JSol’Ex compatible with a wide variety of instruments. An example of this collaboration with Minh is the experimental flat correction, which is recommended when using the SHG 700, and the addition of the SHG 700 to the list of officially supported instruments, alongside the Sol’Ex and the Sunscan.
As you can see, I had been working with MLAstro for quite some time already, but I didn’t own the instrument. This has recently been "fixed" and I’m now a happy user of the SHG 700!
I must say that Minh’s design is fairly impressive; it’s a truly high-quality instrument. The aluminum housing makes it extremely robust, without any flexure, and it comes pre-collimated. In addition, the micro-focusers, which are used for the collimator lens, the camera objective and the wavelength selection wheel, are extremely pleasant to use: anyone who has struggled with the collimation of the Sol’Ex will immediately feel the magic.
Being a regular user of the Sol’Ex, I immediately felt comfortable with the SHG 700: it only took me a few minutes to set up and get my first images! Fine-tuning the focus is a breeze with the micro-focusers; it’s really fantastic to use.
My first images were showcased by MLAStro, but here are a few:
While the above images were fairly easy to produce, I was puzzled because I didn’t manage to get any decent image in Helium D3. This was surprising, because the sensitivity of the SHG 700 is higher, especially since there’s no need for an ND filter or an ERF, so it was curious that the only helium image I was able to produce was this:
To understand what the problem was, it is important to mention that, unlike Minh, I was using JSol’Ex’s one-click, fully automated processing, a feature which was introduced a while ago (June 2024) and has worked extremely well for both Sol’Ex and Sunscan files (N.B.: the Sunscan app introduced the same feature a few days ago).
This feature only works if we can properly compute the dispersion of the spectrum, measured in Angstroms per pixel. In order to compute this, we need to know the specifications of the spectroheliograph, such as the grating density, the focal lengths of the collimator and objective lenses, and the total angle, as well as the pixel size of the camera. Once we know all this, we can determine how many pixels separate the reference line that we use, for example the Sodium D2 line, from the Helium D3 line. To illustrate this, let’s say that we have a dispersion of 0.1Å/px. The Helium line is found at 5875.62Å, while the reference line, the Sodium D2 line, is detected at 5889.95Å. Therefore, the Helium line is 14.33Å away from the D2 line, which means 14.33/0.1 = 143.3 pixels away.
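The arithmetic in that example is straightforward once the dispersion is known; as a sketch (my own function name):

```python
# Wavelengths from the example above, in Angstroms
SODIUM_D2_A = 5889.95  # reference line
HELIUM_D3_A = 5875.62  # target line

def pixel_offset(ref_a, target_a, dispersion_a_per_px):
    """Distance in pixels between a reference and a target line."""
    return abs(target_a - ref_a) / dispersion_a_per_px

# With a dispersion of 0.1 Å/px, the Helium D3 line sits
# about 143.3 pixels away from the Sodium D2 line.
```

Note that the dispersion itself depends on the instrument specifications (grating density, focal lengths, total angle, pixel size), which is exactly why a wrong focal length shifts every derived position.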
I had great experience extracting this line automatically with both the Sol’Ex and the Sunscan, so it was, to say the least, curious that it wouldn’t work with the SHG 700.
So I contacted Minh, and after eliminating obvious candidates, like weak signal or incorrect exposure or gain, I decided to go the "old way" and searched for the helium line manually. To my great surprise, I discovered that when JSol’Ex told me the line should be at 124.9 pixels, I was measuring something closer to 116 pixels! I brought this to Minh, and we identified 3 possible causes:
an error in the computation of the dispersion in JSol’Ex
a different focal length of the SHG 700
a grating which wouldn’t have the expected number of lines/mm
It was also possible, but unlikely, that causes 2 and 3 were combined. The first thing I did was double-check my formula for computing the dispersion in JSol’Ex. I was doubtful that it could be wrong, given that it had been used successfully on different SHGs. In addition, it gave exactly the same result as Ken Harrison’s SimSpec SHG spreadsheet.
So we moved to the second option: the SHG 700 was advertised with a focal length of 75mm. However, the pixel shift I was measuring manually corresponded to a focal length closer to 70mm. I brought this again to Minh, who contacted the supplier of the lens, and here’s what he told me:
"The lens from the first production run had a focal length of 72mm, a discrepancy I was unaware of at the time. I had sourced this lens from a supplier in China and provided them with the Zemax file for my self-designed 6-element Double Gauss 75mm lens. The first prototype was disappointing—its near-UV performance and coating were poor, and contrast was low across the field. This was largely due to my inexperience in optical design, as I had only begun learning Zemax a month prior, making this my first optical project.
I raised these concerns with the supplier, who was very accommodating and offered to help optimize the design while ensuring key parameters such as focal length, aperture, and exit pupil remained unchanged. With each revision, the lens improved—by the second and third prototypes, contrast was significantly better, sharpness in the blue end of the spectrum improved, and the field was much flatter across the FOV. The field of view closely matched the earlier lens, and because I had already tested focal length in previous iterations, I didn’t think to recheck it. Visually, the lenses appeared identical, except for a slight shift in coating hue.
The third prototype was approved for production and became the MLAstro "75mm" compound lens used in all MLAstro SHGs. However, it was only later confirmed by the supplier that the final production lens actually had a focal length of 72mm instead of 75mm. The optimizations for blue performance and field flattening had slightly shortened the focal length."
So I changed the focal length to 72mm and got this image instead:
and a stack of 5 images processed entirely in JSol’Ex:
That’s quite a difference! So it appeared that the software was correct, and that it made it possible to identify a problem in the spectrograph specifications, because a spectral line wasn’t found where it should have been! While going from 75mm to 72mm won’t make much of a difference optically, not using the right numbers makes a huge difference in JSol’Ex: computations are all off, which includes pixel shifts like in this exercise, but also the measured redshifts. In addition, this would make it impossible to perform more complicated tasks like finding the ionized Fe lines when imaging the E-corona. The image is still less contrasted than it should be, which may indicate that the computation is slightly off, or it may just be due to the weather conditions that day; I didn’t have the opportunity to retry.
Lastly, we can actually see fairly easily that the new focal length is a better fit, by using JSol’Ex’s "profile" tab. In that tab, we compare the profile of the spectrum that you captured with a reference spectrum from the BASS2000 database: this is also how the software automatically determines which spectral line is observed, by comparing the profiles together.
With a 75mm length and an H-alpha profile, here’s what we got:
You can see that while the H-alpha profile is found, as soon as we move towards the wings, there are slight shifts between the local minima. When we switch to a 72mm focal length, they are perfectly aligned:
The SHG 700 is a fairly impressive instrument: it’s robust, it’s a pleasure to use with its micro-focusers, and Minh is always super responsive and very patient. Its use doesn’t come without drawbacks, though. Its weight, for example, restricts it to refractors which have a good focuser. The solar disk is also smaller than with the Sol’Ex at an equivalent focal length (see this post for an explanation). However, it produces stunning images, sometimes rivaling those produced with an etalon.
While testing it, I ran into a problem where the helium line images were significantly worse than with the Sol’Ex, which didn’t quite make sense. After investigation, it turned out we had uncovered a difference in the specifications: the lens focal length had been changed from 75mm to 72mm by the supplier, without MLAstro being told. That’s a pity, but in the end the problem is very easy to fix in JSol’Ex. Be sure to upgrade to JSol’Ex 2.11.2, which includes a fix to update the focal length.
Older posts are available in the archive.