Shared posts

17 Jun 09:39

A new high-altitude ballooning record

by Liz Upton

Liz: As some of you clever people have pointed out, the new Pi Zero with camera connector might have been designed with one person very much in mind. That person’s Dave “high-altitude ballooning” Akerman. We got one to him before they went on shelves so he could schedule a flight for launch day. Here’s Dave to tell you what happened (spoiler: he’s got another record for the highest amateur live-transmitted pictures). Thanks Dave!

As many reading this will know, I flew the new Pi Zero on the day it was announced, in order to test a prototype of our new PITS-Zero tracker board. I’d been pleading with Eben, ever since I first saw a prototype of the original Pi Zero, that its low weight would be ideal for live-imaging HAB applications, if only it had a camera port. The camera is pretty much the entire reason for using a Pi for HAB – if you don’t want pictures then a smaller/lighter/simpler AVR or PIC microcontroller will easily do the job (and with less battery power) – so I felt that the CSI-less Pi Zero was a missed opportunity. Eben agreed, and said he would try to make it happen.

[Image: Pi Zero v1.3 with camera connector]

So, when I received a sample Pi Zero with CSI port, I was keen to try it out. However, launching an unreleased device, which might parachute down in front of a curious Pi fan, might not be the best idea in the world, so I had to wait. Fortunately the wind predictions were good for a balloon launch on the Pi Zero CSI launch day, and the flight went well, albeit with a burst rather lower than predicted (balloons vary).

Sony Camera

I had hoped to fly the new Sony camera for the Pi, but in testing the camera would become invisible to raspistill after about two hours and roughly 200-300 photos. Two hours isn’t long enough for a regular flight, and mine was expected to take more than three hours just to ascend, so this wasn’t good. I searched the Pi forum and found that a couple of people using time-lapse photography had hit the same issue; as it was a new issue with no fix or workaround yet, I had to opt for the Omnivision camera instead. This of course gave me a reason to fly the same tracker again as soon as there was a solution for the Sony firmware issue; once there was, I tested it and planned my next flight.
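For anyone curious, that sort of endurance test is essentially just a loop around raspistill. A minimal sketch of the idea, with an illustrative interval and filenames rather than the exact values used in testing:

    # Soak-test sketch (illustrative): take a photo every 30 seconds with
    # raspistill and report when the camera stops responding.
    import subprocess
    import time

    INTERVAL = 30   # seconds between shots (illustrative)
    shot = 0

    while True:
        shot += 1
        result = subprocess.run(
            ["raspistill", "-t", "1000", "-o", "soak_%04d.jpg" % shot],
            capture_output=True,
        )
        if result.returncode != 0:
            print("raspistill failed on shot %d (~%.1f hours in):"
                  % (shot, shot * INTERVAL / 3600.0))
            print(result.stderr.decode(errors="replace").strip())
            break
        time.sleep(INTERVAL)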

Waiting For Baudot

"It's currently a problem of access to gigabits through punybaud"

I’ve written previously about LoRa, but the key points about these Long Range Radio Modules when compared to the old (first used from the air in 1922) RTTY system are:

  • Higher data rates
  • Longer range with the same rate/power
  • Can receive as well as transmit
  • Low host CPU requirements even for receiving

The higher data rates mean that we can send larger images more quickly (throughput is up to 56 times that of 300 baud RTTY), and the receiving capability makes it easy to have the payload respond to messages sent up from the ground. For this flight, those messages are used to request the tracker to re-send any missing packets (ones that the receiving stations didn’t hear), thus reducing the number of annoying missing image sections down to about zero. To give you an idea of the improvement, the following single large picture was sent in about a quarter of the time taken by the inset picture (from my first Pi flight, and at the same pixel scale):

[Image: live image sent over LoRa, with inset image from the first Pi flight for comparison]
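The re-send mechanism is conceptually simple: the ground station keeps track of which image packet numbers it has heard, and uplinks a short request for the ones it missed. A toy sketch of that bookkeeping (the message format here is invented for illustration, not the real SSDV/LoRa gateway protocol):

    # Illustration of the "re-send missing packets" idea, ground-station side.
    # The uplink message format is invented for this example.

    def missing_packets(received_ids, highest_seen):
        """Packet numbers never heard, up to the highest one that was."""
        return [i for i in range(1, highest_seen + 1) if i not in received_ids]

    def build_resend_request(image_id, missing):
        """Short uplink message asking the tracker to re-send those packets."""
        return "RESEND,%d,%s" % (image_id, ",".join(str(i) for i in missing))

    received = {1, 2, 3, 5, 6, 9}   # packets heard by the receiving stations
    print(build_resend_request(42, missing_packets(received, 9)))
    # -> RESEND,42,4,7,8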

LCARS Chase Car Computer

For this flight, I tried out my new chase-car computer. This has a Pi B V2, Pi touchscreen, LoRa module, GPS receiver and WLAN (to connect to a MiFi in the chase car). The user interface mimics the Star Trek LCARS panels, and was written in Python with PyQt. It receives telemetry both locally (LoRa, or RTTY via a separate PC) and also from the central UKHAS server if connected via GSM.
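Under the hood the panel is nothing exotic: Python, PyQt and a timer that refreshes the display as telemetry arrives. A bare-bones sketch of that structure, using PyQt5 with placeholder field names and canned telemetry (not the real LCARS code):

    # Minimal PyQt5 telemetry panel sketch; styling, fields and data source
    # are placeholders rather than the real chase-car program.
    import sys
    from PyQt5.QtCore import QTimer
    from PyQt5.QtWidgets import QApplication, QLabel, QVBoxLayout, QWidget

    class TelemetryPanel(QWidget):
        def __init__(self):
            super().__init__()
            self.setWindowTitle("HAB Telemetry")
            self.altitude = QLabel("Altitude: ---")
            self.position = QLabel("Position: ---")
            layout = QVBoxLayout(self)
            layout.addWidget(self.altitude)
            layout.addWidget(self.position)
            self.timer = QTimer(self)            # poll for new telemetry once a second
            self.timer.timeout.connect(self.refresh)
            self.timer.start(1000)

        def refresh(self):
            # The real program would read the latest LoRa/RTTY sentence here;
            # this just displays fixed example values.
            self.altitude.setText("Altitude: 31000 m")
            self.position.setText("Position: 51.5000, -1.2500")

    app = QApplication(sys.argv)
    panel = TelemetryPanel()
    panel.show()
    sys.exit(app.exec_())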

Run through LCARS screens

LCARS HAB chase-car dash-mount panel, using a Raspberry Pi B V2, official touchscreen display, USB GPS, and 2-channel LoRa board.

The Flight

As per the previous Pi Zero flight, this was under a 1600g balloon filled with hydrogen. Predicted burst altitude was 42km, and I hoped that this time it might achieve that! The payload was the same as last time:

[Image: the payload]

…except of course for the new Sony camera (manually focused for infinity, but not beyond) and a new set of batteries.

On the launch day the weather was overcast but forecast to improve a little, so I decided to wait for a gap in the clouds. When that came, the wind did too (that wasn’t forecast!), which made filling the balloon interesting.

No, my head hasn’t turned into a giant clove of garlic.

Fortunately, the wind did drop for launch, and the balloon ascended towards the gap I’d mentioned in the clouds:

[Image: the balloon ascending towards the gap in the clouds]

The LoRa system worked well (especially once I remembered to enable the “missing packet re-send” software!), and the new camera acquitted itself nicely. I used ImageMagick onboard to apply some gamma to the images (to replace contrast lost in the atmosphere) and to provide a telemetry overlay. The images included this one, which I believe is the highest sent down live from an amateur balloon:

[Image: the highest live-transmitted amateur balloon image, with telemetry overlay]

Burst was a few metres later, comfortably beating my previous highest live-image flight.

And this was the last image it sent. I guessed why. Remember the camera stuck to the outside? My guess was that after burst – when the payload suddenly finds itself without support – the line up to the balloon found its way behind the camera, and ripped it off as the balloon remnants pulled on it. So I can’t show you any images from the descent, but I can show you this shot of the Severn Estuary (processed to improve contrast) from the ascent:

[Image: the Severn Estuary from the ascent]
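The gamma correction and telemetry overlay mentioned above boil down to a single ImageMagick step onboard. Something along these lines would do it; the gamma value and overlay text are illustrative, and this is a sketch rather than the actual flight code:

    # Sketch of the onboard post-processing: lift contrast with a gamma tweak
    # and stamp a telemetry line onto each image before transmission.
    import subprocess

    def process_image(src, dst, telemetry_line, gamma="1.3"):
        subprocess.run(
            [
                "convert", src,
                "-gamma", gamma,                        # compensate for atmospheric haze
                "-fill", "white", "-pointsize", "24",
                "-gravity", "SouthWest",
                "-annotate", "+10+10", telemetry_line,  # telemetry overlay
                dst,
            ],
            check=True,
        )

    # Placeholder telemetry values for the example
    process_image("latest.jpg", "latest_tx.jpg", "12:34:56  51.5000,-1.2500  31000 m")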

In the chase car, I stopped at a point with a good view towards the landing area, so I could get the best (lowest) last position I could. With the payload transmitting both LoRa and RTTY, I had my LCARS Pi receiving the former, and a Yaesu 817 with laptop PC receiving the latter. With no images, the LoRa side dropped to sending telemetry only, which was handy as I was able to receive a lot of packets as the balloon descended. Overall LoRa seemed to be much more reliable from the car than RTTY did, despite the much higher data rate, and I now would be quite happy to chase a balloon transmitting high bandwidth LoRa and nothing else.

With the final position logged, I carefully tapped that into the car sat nav and then drove off to get the payload back. Ten minutes later I remembered that I’d coded exactly that function into my LCARS program! Two screen-taps later, I had on-screen navigation (via Navit); I would also have had voice navigation, but I hadn’t connected a speaker yet.

Both Navit and the car sat nav took me to a hill with the payload about 300 metres away. I waited for another HABber to arrive – his first time chasing – and in the meantime I updated the other enthusiasts online, and took some photographs of the scenery; Shropshire is very pretty.

[Image: Shropshire scenery]

Once Andy arrived, we walked down to the payload, watched (as is often the case) by the local populace:

Ewe looking at me?

As expected, the camera was missing, so if anyone wants a free Sony Pi camera, I can give you a 5-mile radius area to search.


You don’t need CSI to see what went wrong here …

A lot of the balloon was still attached, which helps to explain how the camera was forcibly removed:

[Image: balloon remnants still attached to the payload]

So, a nice flight and recovery. The Sony camera worked well; 868MHz LoRa worked well; the LCARS chase car tracker worked well. Next time, more duct tape!


10 Nov 06:58

Another DIY Photobooth

by processedmeat
Hi all, this is the placeholder for my photobooth build. I will be putting up a more extensive how-to later.

Version 5.0 Build

Here is a photo of my photobooth. It does double duty: the front side is the photo-taking side, and the back side is the photo-viewing and printing kiosk. There ...

12 Apr 18:28

LEAP Motion and EV3

by Marc-Andre Bazergui (bazmarc)
I pre-ordered a LEAP Motion when I first heard of it, and my ultimate goal was to use it with MINDSTORMS. After further thought, realizing I didn’t have the programming skills required to pull this off, I decided to cancel my order.

Today I found someone who has successfully managed to combine both: Dr German Vargas, from the Department of Mathematics at the College of Coastal Georgia.



The code for his project uses the Leap Motion SDK libraries and the MonoBrick communication libraries to control the EV3 via Bluetooth.

I love the end result and where this technology could go next!

Now I feel like getting a LEAP Motion again ;)
