Long Time No Update!
Haven’t updated since… May 25th‽ Lots has happened. My appointments went from weekly, to every other week, to monthly (next week will be my second monthly appointment). I think my next bone marrow biopsy is around the one-year mark, so mid to late November. I’ve had a few therapeutic phlebotomies (bleedings) so far, but I’ll probably need to continue them for a bit over a year.
Went to Wyoming (flew to Chicago with Collin, where we met Corey, and we drove to Wyoming!) to see the solar eclipse in August. All I wanted for my birthday was for the Moon to block out the Sun for about two minutes, and I got it!
Also, we watched the eclipse on a hilltop in Wyoming, on a lake shore, with about 10 other people; I think maybe all of them were from Colorado. Jenny was one of them. She left a comment on my main page, and then I did something that messed up the comment section there, and I haven’t figured out how to fix it yet.
I spent a lot of time trying to motion-stabilize the footage of the eclipse, with mixed results. I shot it in 4K with a Sony A7S ii and a 150-600mm Sigma lens (with a Canon to Sony E-mount adapter) I rented from lensrentals.com. I used OpenCV to find circles, but that didn’t work as well as I hoped. Then I tried contour detection, with fairly good results. And then there is the issue of cropping. A month has already gone by since the eclipse, so I’ve set that aside and the video is here now.
And the plane crossing it, that was incredible! In the video, shortly after the eclipse starts, the screen goes black. I was watching the LCD when that happened, and my first thought was, “did the camera just die‽ how could it die right now‽” You can hear someone, I think Collin, saying the plane is headed straight for it. I looked at the lens and saw that the breeze had blown the solar filter back in front of the camera, and I managed to flip it back off the lens just in time to catch the plane crossing. Later Collin looked up flights for that area, and we found two commercial flights that went directly over us during the eclipse; in the video there are, I think, a couple of times where you hear someone mention there are two planes (Collin once, and one of the other women later, I think). She also says something about how the plane looks like it’s steering to see the eclipse, and at the time both Collin and I thought she meant they had flown into the shadow, and we figured that was planned. But looking at the flight paths later, it looked as if one of the planes banked sharply to one side, then the other, and then back again, possibly to give the passengers a view? The result looks like a big bump in the flight path, which otherwise heads northeast to southwest across the country.
I ended up buying the Sony I rented, and it’s notoriously good at shooting video in low light (the reason I rented it). The night before I returned the lens I decided to look for Andromeda, not knowing how easy it would be to spot. The app I was using to find it was so filled with stars that I had trouble orienting it, so I decided to look at the Pleiades star cluster first, and I happened to catch a meteor on camera.
Recently I’ve been working on another variation of minesweeper again, and I just learned a tiny bit more Swift, which is actually what prompted me to post. I figured if I write about what I learned I’ll be more likely to remember it. It involves the concept of in-out parameters for functions. I had run into them in the past without understanding what they were doing, but the examples I just read were clear.
I also started by trying to figure out what was meant by some code, a class named Array2D&lt;T&gt;, which it turns out is a generic type. Generic types (apparently) let you write code that’s more flexible than normal: the way an array can hold integers, floats, strings, etc., as members is exactly what makes arrays generic. The explanation of generic types involved in-out parameters, which, if my understanding is right, let a function change the values of the arguments you pass in, with the changes visible after the function returns. I think they’re kind of like a shortcut for having a function return a modified value and setting the variable you passed in equal to the returned value. To use an in-out parameter you write inout before the parameter’s type in the function declaration, and when you call the function you prefix the argument with an ampersand. Here’s the documentation for in-out parameters. And here is the documentation for generic types. (Generic code, I guess.)
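Here’s a little sketch of both ideas, based on my (possibly shaky) reading of the docs; the names are made up:

```swift
// An in-out parameter lets the function modify the caller's variable directly.
func doubleInPlace(_ value: inout Int) {
    value *= 2
}

var score = 21
doubleInPlace(&score)   // the ampersand marks the in-out argument
// score is now 42

// A generic type works with any element type T.
struct Array2D<T> {
    let columns: Int
    let rows: Int
    private var storage: [T]

    init(columns: Int, rows: Int, initialValue: T) {
        self.columns = columns
        self.rows = rows
        storage = Array(repeating: initialValue, count: columns * rows)
    }

    subscript(column: Int, row: Int) -> T {
        get { storage[row * columns + column] }
        set { storage[row * columns + column] = newValue }
    }
}

// The same type works as a Bool grid (minesweeper flags), an Int grid, etc.
var grid = Array2D<Bool>(columns: 3, rows: 3, initialValue: false)
grid[1, 2] = true
```

Which is presumably why the minesweeper code I was reading used it: one Array2D can hold mine flags, adjacency counts, whatever.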
I also revisited an idea I’ve had for a long time, and worked on quite a bit a few years ago: trying to generate point clouds from a video with a pull focus. I managed to get an iPhone app working that performs a pull focus, and then wrote some OpenCV code to pull out the in-focus parts, but it does a terrible job. After the bad first attempts I remembered reading, years ago, about a simple method for focus stacking (a related problem) that consists of blurring the photo and subtracting the blurred result from the original. The idea is that the in-focus parts change the most during blurring, while the out-of-focus parts don’t change much, so the difference is mostly the parts that were in focus. But the results of that were pretty horrendous too. I also had the color off when I made the video below.
When I return to it I’ll try implementing something like the contrast-based approach to autofocus. Something I should keep in mind is that, ideally, each pixel should produce only one point, so I really want to look through the “column” of a single pixel (that pixel over the duration of the video), and select the frame where its local contrast is highest? That doesn’t sound terrible…