Since the Android folks decided that MTP should be the way to connect Android devices via USB, I ran into some trouble while trying to get comfortable access to the OnePlus One. There are a couple of forum threads and blog entries out there describing how to hack up some udev rules and use scripts that try to automount the MTPFS. Some recommend mtpfs, others use go-mtpfs or jmtpfs. I've tried them all, and the result was still not what I wanted. Why can't I just plug it in and have it pop up in Thunar, like any other USB/flash device?
Well, the answer is simple: because I had forgotten to supply the mtp USE flag for gvfs, which manages all mounting-related tasks in Xfce4 for me. I also switched to libmtp-9999, because the stable release doesn't know the OnePlus One yet.
Since it's basically a per-machine decision whether a box will ever have to mount an MTP device, just put mtp into your global USE flags in /etc/portage/make.conf.
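For illustration, the relevant line in /etc/portage/make.conf could then look like this (the other flags are just placeholders for whatever your system already uses):

```
# /etc/portage/make.conf -- example only, keep your existing flags
USE="X dbus udev mtp"
```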
Let's see how that would play out:
$ emerge -upvND --with-bdeps=y @world
And voila:
[ebuild U *] media-libs/libmtp-1.1.8 [1.1.6-r1] USE="crypt -doc -examples -static-libs" 0 kB
[ebuild R ] gnome-base/gvfs-1.20.2 USE="cdda gtk http mtp* udev udisks -afp -archive -avahi -bluray -fuse -gnome-online-accounts -gphoto2 -ios -libsecret -samba -systemd {-test}" 0 kB
After the emerge and a logout/login to get gvfs reloaded, you're good to go.
Connected to a Thinkpad X230 USB 2.0 port, it sustained a read transfer rate of about 29 MB/s (copying OnePlus One 4K videos to the Thinkpad's SSD), which is more or less the maximum one can get out of USB 2.0 anyway.
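As a back-of-the-envelope sanity check (my own rough numbers, not a benchmark): USB 2.0 high-speed signals at 480 Mbit/s, but bus scheduling, bulk-transfer framing and the MTP protocol itself eat a large chunk of that, so sustained transfers typically top out somewhere in the 30-40 MB/s range:

```python
# Back-of-the-envelope USB 2.0 throughput estimate (rough numbers).
signaling_rate_mbit = 480                    # USB 2.0 high-speed signaling rate
theoretical_mb_s = signaling_rate_mbit / 8   # 60 MB/s, ignoring all overhead

# A practical ceiling around ~35 MB/s is a rough assumption here.
practical_ceiling_mb_s = 35
measured_mb_s = 29

print(f"raw: {theoretical_mb_s:.0f} MB/s, measured {measured_mb_s} MB/s is "
      f"{measured_mb_s / practical_ceiling_mb_s:.0%} of a "
      f"~{practical_ceiling_mb_s} MB/s practical ceiling")
```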
For as long as I can remember, aviation has sparked my interest, so it was only natural that, as a kid, I bought the F/A-18 Interceptor flight simulator to play pilot on my Amiga 500 back in the late-80s/early-90s era. We even had weekend sleepovers where a couple of friends and I took over the attic of another friend and built “cockpits” out of cardboard with cut-outs for our monitors. We basically ignored the combat aspect; I was just interested in the machine and in flying under the Golden Gate Bridge, since the default base was in San Francisco and the virtual cockpit looked like this:
Since my quest is to use, test, progress and promote open-source, peer-made technology, really checking out FlightGear was long overdue. So, more than 20 years later, I find myself back in the cockpit of an airplane sitting at the end of a runway in San Francisco, which again is the default scenery :) But the experience is nothing like it used to be. Back then, we needed an awful lot of imagination to reach the level of immersion needed to experience it as fun, but this time I wanted to go bleeding edge and looked for a realistic Boeing 787 model.
The manual was a bit vague about where to install it, so I did it like this:
$ cd ~/.fgfs/
$ mkdir Aircraft
$ cd Aircraft
$ git clone https://gitorious.org/fg-boeing-787-8/fg-boeing-787-8-main.git 787-8
Then I put a link into FlightGear's DATA directory, which is /usr/share/games/flightgear on Gentoo:
$ sudo ln -s ~/.fgfs/Aircraft/787-8 /usr/share/games/flightgear/Aircraft/787-8
To start FlightGear with the 787-8, type
$ fgfs --aircraft=787-8
and you'll get this awesome cockpit view:
I followed the Kickstarter campaign for a while but then almost forgot about the Spark Core, which has finally become available. At first glance the hardware design looks solid, and it comes with a complete open-source stack, including the hardware design files, the firmware for the STM32 and the infrastructure/server components.
Other hackers have already become creative, which, to me, indicates a high level of versatility and hackability. It also promises faster development than other solutions for similar use cases, where you want something ATMega/Arduino-ish in a small package but with full network accessibility.
The light-control system I've always had in mind seems rather close now, with a couple of these and some glue parts plus infrastructure, so we'll have to get a few of them to build and publish it :)
When you live in Munich and use public transportation, especially buses and trams, you will have noticed that during the last couple of months a lot of new displays have appeared at almost every station that had no real-time info display before. They obviously have no cables/connections and no visible antennas, so I kept wondering how they are powered and how they get their data.
What follows is a journey describing how it's possible to answer these questions and learn something completely new in just a couple of days with the help of the Internet and other kindred spirits who shared and published their research and results. Read on and you will learn how the system works and get detailed answers to those questions.
Call for Help:
If you see one of these displays, please have a look at the top of the left side of the blue case; there should be a 4-digit numerical ID. Please drop it in the PAD, in a comment or on IRC.
In case you haven't noticed yet, NASA has put a new experiment on the International Space Station, called High Definition Earth Viewing (HDEV). You can watch it online via Ustream. The experiment's primary purpose is to generate long-term test data on whether cheap off-the-shelf consumer HD cameras can be used in space instead of extremely expensive purpose-built “space” cameras. As a benefit, everyone can now watch the world from above in near-realtime, so thank you guys for sharing.
Playing the stream in a browser is cool, but having it play in mplayer(2)/mpv is way more flexible (and can also benefit from GPU video offloading). To grab the stream from Ustream you can use https://github.com/chrippa/livestreamer, which works very well.
mplayer2 based Systems
$ livestreamer -Q http://www.ustream.tv/channel/iss-hdev-payload best --player "mplayer2 -nosound"
MPV based Systems
$ livestreamer -Q http://www.ustream.tv/channel/iss-hdev-payload best --player "mpv --no-audio"
The next level is to use xwinwrap to put the stream on your desktop, instead of a static image. When you've got alpha blending/compositing in your terminals (xterm/urxvt etc.) you can even see it in the background of your shells (way awesome).
When you've installed livestreamer and xwinwrap, just call it like this:
mplayer2 based Systems
$ livestreamer -Q http://www.ustream.tv/channel/iss-hdev-payload best \
    --player "./xwinwrap -ni -fs -s -st -sp -b -nf -- mplayer2 -wid WID -nosound"
MPV based Systems
$ livestreamer -Q http://www.ustream.tv/channel/iss-hdev-payload best \
    --player "./xwinwrap -ni -fs -s -st -sp -b -nf -- mpv --wid 0 --no-audio"
You can use our xfce-planet script, if you want to keep track of the ISS position on another monitor:
Be advised though: the view is spectacular (especially when you've got it on the wall with a projector) and tends to hypnotize everyone in the room as they grasp the beauty of our planet and forget all the superfluous, puny, made-up problems mankind still needlessly fights over.
Last Saturday, instead of having breakfast, we wanted more Geiger counters. So we had an early-morning soldering session in the open air, and with plenty of sunshine we finished two more 1.0 prototype boards, so that we have more active PiGIs for tests and further development.
Back in the early 90s, when I first tried GNU/Linux, there weren't many things I could really do with my X session, due to a lack of knowledge, skill, confidence and available open-source software. However, I did play with xearth, a program that renders a somewhat accurate image of our planet. A couple of years later it was replaced by xplanet, which offered a lot more features and eye-candy options.
With NASA's release of the Visible Earth program we suddenly had open access to high-detail day/night, bump (relief) and specular (reflection) maps of the Earth, which can be used as textures with xplanet.
After playing with xplanet again for a couple of days to get realtime satellite positions directly on the desktop (see xfce-planet), I got frustrated by the cloud layer again. There was a time when some people put up mirrors of a near-current (3-6 hours) global cloudmap we could use as a source for xplanet. Now, however, the high-resolution image seems to have moved behind a paid subscription, and the low-resolution image is distributed via CoralCDN which, although I like the concept, constantly failed to deliver the global cloudmap.
By sheer accident I stumbled upon https://github.com/jmozmoz/cloudmap. I tried it locally and it worked like a charm, which in turn led to the idea of offering the image I need anyway to everyone else who desires a fresh, high-detail cloudmap without having to set up the required infrastructure. And with that, the Global Cloudmap Generator Robot was born: it creates a new cloudmap every three hours, then commits and pushes it to the public global cloudmap repo, using GitHub's infrastructure as a CDN we can hopefully rely upon.
And you can just get the latest map by grabbing:
https://raw.githubusercontent.com/apollo-ng/cloudmap/master/global.jpg
If you're interested in how it all works, or want to set up your own independent cloudmap generator, here is a simplified rundown:
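In a nutshell, the three-hourly job boils down to something like the sketch below. All paths, the config file and the exact tool invocation are assumptions made for illustration; the actual robot may be wired differently:

```shell
#!/bin/sh
# update-cloudmap.sh -- hypothetical sketch of the robot's job, e.g. via cron:
#   0 */3 * * * /home/robot/bin/update-cloudmap.sh
set -e
cd /home/robot/cloudmap            # local clone of the public cloudmap repo

# 1. Download the latest geostationary satellite images and stitch them
#    into one global cloudmap (this is what jmozmoz/cloudmap does; the
#    command name and config file here are assumptions).
create_map -c cloudmap.ini

# 2. Commit and push the fresh image, so GitHub serves it as a CDN.
git add global.jpg
git commit -m "cloudmap update $(date -u +'%Y-%m-%d %H:%M UTC')"
git push origin master
```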
Continuing from the final aspect of JamesT's very nice exploration of PiGI, we want to examine further the possibilities of creating randomness from the output of a Geiger counter. Of course we are using a PiGI as the testbed for our experiments.
True randomness, in the sense of “provably unpredictable”, is not easily available on a computer. Good arguments can be made that the available sources of entropy like clock jitter, floating analog inputs, network traffic etc., combined with the algorithmic magic of /dev/random and /dev/urandom, are more than adequate for all non-cryptographic purposes, and probably unpredictable enough to safely use in cryptography without worrying too much.
But paranoia and cryptography run hand in hand and using radioactive decay as source of entropy is an interesting application of physics to provide access to the elusive “true” randomness on a computer.
While the half-life gives the duration after which a particle has decayed with a probability of 50%, the exact moment of decay cannot be predicted in any way. So a sample of radioactive material creates a stream of events in a Geiger counter which, due to the enormous number of atoms even in the tiniest sample, averages to a steady rate, but the specific event times (or delays between events) are unpredictable.
In its most fundamental form, our task is to produce a stream of equally probable 0s and 1s. To do this, we compare consecutive pairs of delays between measured events: if the first delay is longer than the second, the next bit is a 0; if it is shorter, a 1; if both are equal, no bit is emitted.
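A minimal Python sketch of this comparator, consuming delays in non-overlapping pairs (overlapping comparisons would correlate adjacent bits). Real Geiger pulses are stood in for here by simulated exponentially distributed delays:

```python
import random

def bits_from_delays(delays):
    """Compare successive non-overlapping pairs of inter-event delays:
    first longer than second -> 0, shorter -> 1, equal -> discard."""
    bits = []
    for a, b in zip(delays[::2], delays[1::2]):
        if a > b:
            bits.append(0)
        elif a < b:
            bits.append(1)
        # a == b: emit nothing, to avoid introducing bias
    return bits

# Simulate inter-event delays of a steady ~20 counts-per-second source.
rng = random.Random(42)
delays = [rng.expovariate(20.0) for _ in range(10000)]
bits = bits_from_delays(delays)

ones = sum(bits)
print(f"{len(bits)} bits, fraction of ones: {ones / len(bits):.2%}")
```

With a memoryless (exponential) delay distribution, each pair is equally likely to be ascending or descending, so the ones-fraction converges to 50% regardless of the count rate.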
At long last, the Felix 3.0B upgrade parts finally arrived last week, and the picoprint underwent a complete overhaul. On Friday night the printer was completely disassembled, except for the bare frame.
All printed plastic parts (V2.0) were packed away (as a backup for now), and all screws, nuts and other metallic parts took a 24-hour bath in WD-40 before being cleaned, re-sorted and stored in the original container. With only Sunday left, the assembly of the 3.0B got as far as implementing proper cable management.
This shot gives a good impression of the amount of cables to route and this is only the bare Felix configuration without any other extras. The next long weekend is directly ahead, so the printer should be fully upgraded, re-pimped and ready to print on the weekend.
About time, I miss the smell of extruded PLA in the morning…
I've seen this “exxploit” truck in the center of Munich a couple of times from the bus but today it was standing right in front of me so I could take a picture and share it because I really like the message: