Ah, I did something vaguely similar for lulz a few months ago with a ThinkPad X61 tablet - the hard disks have accelerometers in them.
A few minutes of dorkily waving the laptop around showed how the values changed, and the range differences were straightforward enough that I could compute the device orientation (which way is facing up) with a simple shell script!
Five (okay, maybe ten) minutes later, and X11 was happily rotating and inverting the display automatically as I flipped the thing around. (Seeing Chrome auto-flip exactly like a tablet does (albeit without the nice animations) was neat.)
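The basic loop is simple enough to sketch. This is a guess at the approach, not the author's actual script: the hdaps driver on ThinkPads exposes the accelerometer as an "(x,y)" pair in sysfs, and the sysfs path, rest values, and threshold below are all assumptions you'd tune for your own machine.

```python
# Hedged sketch: poll the ThinkPad hdaps accelerometer and rotate the
# display with xrandr. Path, rest point, and threshold are assumptions.
import re
import subprocess

HDAPS_POSITION = "/sys/devices/platform/hdaps/position"  # typical hdaps path

def read_position(path=HDAPS_POSITION):
    """Parse the '(x,y)' reading the hdaps driver exposes."""
    with open(path) as f:
        x, y = map(int, re.findall(r"-?\d+", f.read()))
    return x, y

def orientation(x, y, rest_x=0, rest_y=0, threshold=50):
    """Map raw tilt deltas to an xrandr rotation name."""
    dx, dy = x - rest_x, y - rest_y
    if abs(dx) > abs(dy):
        if dx > threshold:
            return "left"
        if dx < -threshold:
            return "right"
    elif dy > threshold:
        return "inverted"
    return "normal"

def rotate(direction):
    """Apply the rotation; xrandr -o accepts normal/left/right/inverted."""
    subprocess.run(["xrandr", "-o", direction], check=True)
```

In a loop you'd call `read_position()`, feed it through `orientation()`, and only call `rotate()` when the result changes, to avoid hammering xrandr.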
...with one caveat. Every single orientation change made the backlight kick off and back on again.
Tons of printk() debugging was insufficient to fully track down what was causing it. My "solution" was to neuter the bits of KMS that actually did the low-level DPMS calls - the result was some mildly scary display tearing in certain circumstances, but the backlight didn't flicker anymore.
Unfortunately, 'xset dpms force standby' (or 'suspend') no longer worked either - nor did anything else that put the display to sleep via DPMS. Whoops.
libdrm, KMS, DRM and X11 are a mess.
(As for what I was unable to get to the bottom of - something at the X11 level was deciding, in certain circumstances, that some rotates required "full GPU hard resets" and other rotates didn't. For example, rotating left was fine, but rotating right was not. And inverting was fine, but switching back to normal was not. But, get this, _if I had an external display attached_ (or had the VGA port forced-on), rotating left and right were both fine! Given that I was later able to reproduce this behavior on both an Intel and an AMD system, this is why I glare at KMS/DRM and call them a mess.)
Spoiler: it has zero to do with quantum entanglement. The author thought that was funny? Cool though.
The huge range of solutions that developers with different backgrounds come up with is evidenced by the fact that when I read this...
The code is pretty straightforward. It opens up a socket to the host, then for each motion update it creates a MotionData value, sets the properties on it, encodes it into JSON and sends it to the script running in Blender. It reads any data the host sends and discards it.
...my first thought was: why JSON? I'd be curious to know the reasoning, since if I wanted to do this same task, I'd just send the values directly as binary. 4-byte floats seem the natural choice here, since that's the representation both sides ultimately want. Also, this protocol is clearly unidirectional, so there's no need to even bother with the other direction.
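To make the comparison concrete, here's a sketch of both encodings using Python's stdlib. The field names are hypothetical (I don't know what the actual MotionData properties are); the point is just that three packed little-endian floats are a fixed 12 bytes, versus a larger, variable-length JSON string that both sides then have to parse and re-convert.

```python
# Sketch comparing JSON vs. raw binary for a motion update.
# Field names ("yaw"/"pitch"/"roll") are assumptions for illustration.
import json
import struct

motion = {"yaw": 0.12, "pitch": -0.45, "roll": 1.57}

# JSON: human-readable, but every update pays string-encoding overhead.
json_payload = json.dumps(motion).encode()

# Binary: pack three little-endian 4-byte floats in an agreed-upon order.
bin_payload = struct.pack("<3f", motion["yaw"], motion["pitch"], motion["roll"])

# Receiver side: one unpack call, no parsing.
yaw, pitch, roll = struct.unpack("<3f", bin_payload)
```

The only real costs of the binary route are that the wire format is opaque to a human watching the socket, and both ends must agree on field order and endianness up front.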
I recently started learning about how powerful Blender's Python API is. I have a YouTube channel that people support through donations, and every month I send out little 3D printed objects as a thank-you gift. I wanted to include a 3D printed little thank-you card with each person's name on it, but obviously I can't take the time to model a unique card for each person.
After just a few hours of playing around I was able to set up a python script that reads people’s names out of a CSV file, launches Blender as a background process, generates a model of a little plate with text on it thanking a person by name, exports that model as an STL, calls up Slic3r (which can also be run headlessly) to generate the gcode for the printer, and then finally uploads that file to my 3D printer.
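The skeleton of that pipeline might look like the following. To be clear, this is my own reconstruction, not the author's script: `make_card.py` is a hypothetical Blender script, and the file naming and the upload step are assumptions. Blender genuinely supports headless runs via `--background --python`, and Slic3r genuinely has a CLI, which is what makes the whole chain scriptable.

```python
# Hedged sketch of a names-CSV -> Blender -> Slic3r pipeline.
# Script names, file layout, and CSV shape are assumptions.
import csv
import io
import subprocess

def read_names(csv_text):
    """Pull names out of a one-column CSV."""
    return [row[0] for row in csv.reader(io.StringIO(csv_text)) if row]

def blender_cmd(name, stl_path):
    # Headless Blender; everything after "--" is passed to the script.
    # make_card.py (hypothetical) would generate the plate and export the STL.
    return ["blender", "--background", "--python", "make_card.py",
            "--", name, stl_path]

def slicer_cmd(stl_path, gcode_path):
    # Slic3r's headless CLI: slice an STL straight to gcode.
    return ["slic3r", stl_path, "--output", gcode_path]

def run_pipeline(csv_text):
    for name in read_names(csv_text):
        stl = f"{name}.stl"
        subprocess.run(blender_cmd(name, stl), check=True)
        subprocess.run(slicer_cmd(stl, f"{name}.gcode"), check=True)
        # Final step would upload f"{name}.gcode" to the printer; that part
        # depends entirely on the printer's own API, so it's omitted here.
```

Each stage just shells out to the next tool, which is the appeal: no bindings, no plugins, just three CLIs glued together.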
Previously I was writing people little notes by hand. Not only is this much cooler, it takes considerably less work from me. I just run a single command to execute the script and then walk over to my printer and hit “print”. A couple hours later I have a pile of little thank you cards.
I really think Blender is the crown jewel of open source software.
Reminds me of when I was playing with an Oculus Rift DK2 kit: