
[{"content":" Hi! I am a Master's student in the Computer Science in Engineering course at the Hamburg University of Technology (TUHH). Welcome to my portfolio, where I present some of my hands-on projects that bring theory to life. Feel free to take a look!\n","date":"29 March 2026","externalUrl":null,"permalink":"/","section":"","summary":"","title":"","type":"page"},{"content":"Hi!\nSome of my friends have websites too, take a look!\nJoris Gutjahr # I know Joris since a student project in like 2018, where he developed a democratic social network as a student. He is a very strong expert in computer science and history too, take a look at his website!\njorisgutjahr.eu Posts https://jorisgutjahr.eu/blog/ Jonathan Grotelüschen / Tippfehlr # I know Jonathan from orchestra, we visit the chaos computer club congresses together. At 37C3 in 2023, we developed a hilarious multiplayer \u0026ldquo;game\u0026rdquo; in Python with the pygame library (check out Lemon Game)\nFigure 1: Logo of the lemon game! tippfehlr.dev tippfehlr tippfehlr’s website https://tippfehlr.dev/ Pancake Appletree # Pancake Appletree is a very smart student in school who already grasped Arch Linux and all the keyboard tricks.\nappleptree.frama.io Welcome! — Appleptree https://appleptree.frama.io/ Jaron Kramer # Please take a look at his website, it has an awesome design! Jaron is also a very bright student still in school who already trains his own AI networks, rents GPUs to run LLMs himself and has a home assistant instance at home with his own voice assistant.\ndev.jkramertech.com Jaron Kramer | Portfolio Schüler \u0026amp; Entwickler mit Fokus auf Python, KI und Automatisierung. https://dev.jkramertech.com We know each other, and you have a custom website too? 
# Send a simple Email ","date":"29 March 2026","externalUrl":null,"permalink":"/friends-blogs/","section":"","summary":"","title":"Friends","type":"page"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/articles/","section":"Articles","summary":"","title":"Articles","type":"articles"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/tags/docker/","section":"Tags","summary":"","title":"Docker","type":"tags"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/tags/ios-automation/","section":"Tags","summary":"","title":"IOS Automation","type":"tags"},{"content":"Hi!\nSince I needed something to procrastinate with, I thought of finally getting rid of the Google Tasks app! The problem is that most task apps are loaded with WAY too many features. So I thought of developing / vibecoding my own workflow!\nThis is what I set up! (Click to zoom)\nFigure 1: illustration of the complete workflow My requirements # To-Do list easily accessible on all my devices (Linux Mint, macOS, iOS) To-Do list self-hosted, light-weight → no additional cost, privacy Good iOS app with a widget showing the tasks Vikunja as the core system # Vikunja is a self-hostable to-do list management app that can be set up easily with Docker. It has a nice, simple web interface and provides an API. There are two unaffiliated iOS apps that provide a native UI for the Vikunja To-Do list. I chose \u0026ldquo;Kuna\u0026rdquo; for its simplicity.\nThe challenge: Neither Vikunja iOS app has a widget that displays the To-Do list on the home screen!\nBonus: The attachment contains the docker-compose.yml for setting up Vikunja yourself\nThrow together a custom iOS widget # Figure 2: Homescreen with custom widget The iOS app Scriptable brings automation and custom widgets to iOS, going beyond the native Shortcuts app from Apple. 
It queries the Vikunja To-Do server, parses the response and displays the first five tasks nicely. You can find the corresponding code in Attachment 2.\nOpen Kuna app on widget click # Video 1: Demo video showing one-click-UI-path from widget to Kuna app I wanted to be able to tap the widget and then directly edit the To-Do list. This was a little tricky:\n1. Touching a widget opens a URL\nWith the widget generated by Scriptable, a URL can be opened on tap. However, how do you open an app with a URL?\n2. URL opens app\nIn fact, some iOS apps can be opened with a so-called \u0026ldquo;custom URL scheme\u0026rdquo;! The difference is in the beginning: Instead of https://\u0026lt;domain\u0026gt;/location, one has \u0026lt;appname\u0026gt;://\u0026lt;in-app-location\u0026gt;. If you are on an iPhone, you can easily try it out yourself - just open whatsapp:// in Safari, which opens WhatsApp accordingly. Unfortunately, the \u0026ldquo;Kuna\u0026rdquo; To-Do app cannot be opened like this, but for the Shortcuts app, it\u0026rsquo;s possible using shortcuts://.\n3. Execute shortcut directly from URL\nBy customizing the URL even more, one can execute shortcut workflows: shortcuts://run-shortcut?name=OpenKuna. The shortcut itself is simple:\nFigure 3: Shortcuts screenshot showing automation that opens the Kuna app Why not directly use the iOS Shortcuts app? # Figure 4: Complex iOS Shortcuts workflow parsing the Vikunja API directly This is the workflow I initially created on the iPhone. It accesses the API similarly to Scriptable – the shortcut can even be added to the home screen! However, the widget cannot show the To-Do list as it\u0026rsquo;s just a button - not good!\nResult # I now have a proper, self-hosted To-Do management system. It\u0026rsquo;s cool to play around with the APIs; you get to know so many possibilities! 
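The widget logic described above (fetch the tasks, keep the first five, render their titles) is easy to mirror outside Scriptable when playing with the API. A minimal Python sketch, assuming each task object carries a title field and a done flag as in the attachment's JavaScript - the sample data is made up:

```python
# Sketch of the widget's task-list logic, assuming Vikunja-style
# task objects with "title" and "done" fields (hypothetical sample data).

def widget_lines(tasks, limit=5):
    """Return display lines for the first `limit` unfinished tasks."""
    open_tasks = [t for t in tasks if not t.get("done", False)]
    return ["- " + t["title"] for t in open_tasks[:limit]]

# Example with made-up tasks:
sample = [
    {"title": "Buy milk", "done": False},
    {"title": "Write article", "done": True},
    {"title": "Fix bike", "done": False},
]
print(widget_lines(sample))  # ['- Buy milk', '- Fix bike']
```

With the filter=done%20%3D%20false query from the attachment, the server already drops finished tasks, so the done check here is just belt and braces.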
However, it bugs me that the animations in iOS are so slow and Apple really shows every intermediate step of the automation.\nShould you set this up too?\nI guess it doesn\u0026rsquo;t make sense to go through the struggle just to get a To-Do list onto your home screen xD. Also, you don\u0026rsquo;t need to host your own Vikunja To-Do server; Vikunja also offers a hosted version.\nThat\u0026rsquo;s it! Thanks for reading.\nI am interested in your thoughts! - Reply with a simple Email Have a nice day,\nCarl\nAttachments # Attachment 1: Vikunja to-do server docker-compose.yml # version: \u0026#39;3.8\u0026#39;\nservices:\n  vikunja-db:\n    image: mariadb:10\n    container_name: vikunja-db\n    command: --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci\n    environment:\n      - MYSQL_ROOT_PASSWORD=choose_password_root # adjust password\n      - MYSQL_USER=vikunja\n      - MYSQL_PASSWORD=choose_password_for_mysql # adjust password\n      - MYSQL_DATABASE=vikunja\n    volumes:\n      - /path/on/host/vikunja/db:/var/lib/mysql # adjust host path\n    restart: unless-stopped\n    networks:\n      - docker_website_network\n  vikunja:\n    image: vikunja/vikunja\n    container_name: vikunja-app\n    ports:\n      - 8080:3456 # Maps the internal port 3456 to your host\u0026#39;s 8080\n    environment:\n      - VIKUNJA_DATABASE_HOST=vikunja-db\n      - VIKUNJA_DATABASE_TYPE=mysql\n      - VIKUNJA_DATABASE_USER=vikunja\n      - VIKUNJA_DATABASE_PASSWORD=choose_password_for_mysql # adjust password\n      - VIKUNJA_DATABASE_DATABASE=vikunja\n      - VIKUNJA_SERVICE_JWTSECRET=choose_password_\n      - VIKUNJA_SERVICE_PUBLICURL=https://your_domain.de # adjust to your domain\n      - VIKUNJA_CACHE_ENABLED=true\n      - VIKUNJA_CACHE_TYPE=memory\n      - VIKUNJA_SERVICE_ENABLEREGISTRATION=false\n    volumes:\n      - /path/on/host/vikunja/files:/app/vikunja/files # adjust path\n      - /path/on/host/vikunja/config.yml:/etc/vikunja/config.yml # adjust path\n    depends_on:\n      - vikunja-db\n    restart: unless-stopped\n    networks:\n      - docker_website_network\nnetworks: # make sure the network can be accessed from the internet\n  docker_website_network:\n    external: true\nAttachment 2: Scriptable app 
on iPhone: Javascript code # // Vikunja API config - adjust!\nlet url = \u0026#34;https://your-domain.de/api/v1/projects/1/tasks?filter=done%20%3D%20false\u0026#34;\n// \u0026#34;filter=done%20%3D%20false\u0026#34;: directly filters out finished tasks!\n// Get token from vikunja web UI\nlet key = \u0026#34;your_vikunja_api_token\u0026#34;\nlet req = new Request(url)\nreq.headers = { \u0026#34;Authorization\u0026#34;: key }\nlet tasks = await req.loadJSON()\nlet widget = new ListWidget()\nwidget.backgroundColor = new Color(\u0026#34;#1a1a1a\u0026#34;)\n// Execute the OpenKuna shortcut to open the Kuna app\n// This needs to be set up by yourself in the Shortcuts app\nwidget.url = \u0026#34;shortcuts://run-shortcut?name=OpenKuna\u0026#34;;\n// Build widget\nlet title = widget.addText(\u0026#34;To-Do\u0026#34;)\ntitle.font = Font.boldSystemFont(14)\ntitle.textColor = Color.white()\n// Show first 5 tasks\ntasks.slice(0, 5).forEach(task =\u0026gt; {\n  let t = widget.addText(\u0026#34;- \u0026#34; + task.title)\n  t.font = Font.systemFont(12)\n  t.textColor = Color.lightGray()\n})\nif (config.runsInWidget) {\n  Script.setWidget(widget)\n} else {\n  widget.presentMedium()\n}\nScript.complete() ","date":"9 March 2026","externalUrl":null,"permalink":"/articles/todo-system/","section":"Articles","summary":"","title":"Self-hosted To-Do list with custom iOS widget","type":"articles"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/tags/","section":"Tags","summary":"","title":"Tags","type":"tags"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/tags/vikunja-task-management/","section":"Tags","summary":"","title":"Vikunja Task Management","type":"tags"},{"content":"","date":"9 March 2026","externalUrl":null,"permalink":"/tags/virtual-private-linux-server/","section":"Tags","summary":"","title":"Virtual Private Linux Server","type":"tags"},{"content":" Hi!\nIn my Computer Science in Engineering master’s degree, I recently took the course \u0026ldquo;bioelectromagnetics\u0026rdquo;, 
or BEM for short. BEM deals with the interaction of electromagnetic fields with biological tissue.\nThe course went well in terms of the grade, and since the topic is exotic, I thought you might be interested in what it\u0026rsquo;s about! In the following you see one of my glossaries - I usually write them a few days before the exam. In the process of writing the glossary, I pass all the knowledge from the lecture contents and group exercises through my brain, which helps me reflect on the course content. The glossary is definitely not meant to explain everything! But it may give some idea of the topics we dived into. If you are intrigued, you can read the book \u0026ldquo;Bioimpedance and Bioelectricity Basics\u0026rdquo; by Sverre Grimnes and Ørjan G. Martinsen. A lot of our lecture content is based on it.\nA short introduction # In the BEM course, we divided the topics mostly by the frequency of the electromagnetic waves:\nFrequency\nAt frequencies below 1 MHz, we modelled biological tissue with electrical circuits - meaning that effects can be described best with circuits of simple resistors and capacitors. Between 1 MHz and 10 THz, we modelled the effects using plane EM waves. These are much more complicated - major effects are the power loss density as the EM wave travels through the body, which results in heat, as well as polarization, reflection and refraction. Between 10 THz and 2 PHz (being $2 \\cdot 10^{15}$ Hz), we treated electromagnetic effects as rays. This interval also includes green light at a frequency of around 560 THz. This frequency range is not that interesting since the only effect is heat dissipation. In the last frequency range – above 2 PHz – we treat EM waves as photons. 
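The photon picture can be made concrete with Planck's relation E = h·f. A quick sketch (Python; the constants are standard SI values, not from the lecture, and the comparison with ionization energies is only order-of-magnitude):

```python
# Photon energy E = h * f, converted to electron volts.
PLANCK = 6.62607015e-34  # Planck constant in J*s (exact SI value)
EV = 1.602176634e-19     # joules per electron volt (exact SI value)

def photon_energy_ev(freq_hz):
    """Energy of a single photon at frequency freq_hz, in eV."""
    return PLANCK * freq_hz / EV

print(round(photon_energy_ev(560e12), 2))  # green light: ~2.32 eV
print(round(photon_energy_ev(2e15), 2))    # ~8.27 eV, approaching typical
                                           # atomic ionization energies (~10 eV)
```

This is why the few-PHz region is a sensible place to draw the non-ionizing/ionizing boundary: only there does a single photon carry enough energy to free an electron.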
The 2 PHz frequency marks the transition between non-ionizing and ionizing radiation: Above roughly 2 PHz, the energy of light / EM waves is enough to knock electrons out of atoms, which can damage DNA and potentially cause cancer.\nSize\nWe looked at both the microscopic level (single biological cells) and macroscopic objects like the heart or the human body as a whole.\nApplications\nThroughout the course we looked at a lot of different applications, for example electrocardiography (ECG), electroencephalography (EEG), magnetic resonance imaging, tasers and their health impacts, and more.\nI am interested in your thoughts! - Reply with a simple Email Have a nice day,\nCarl\nBioelectromagnetics glossary # Open PDF in new tab ","date":"2 February 2026","externalUrl":null,"permalink":"/articles/bioelectromagnetics/","section":"Articles","summary":"","title":"Bioelectromagnetics glossary","type":"articles"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/biology/","section":"Tags","summary":"","title":"Biology","type":"tags"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/chemistry/","section":"Tags","summary":"","title":"Chemistry","type":"tags"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/electromagnetism/","section":"Tags","summary":"","title":"Electromagnetism","type":"tags"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/maxwell-equations/","section":"Tags","summary":"","title":"Maxwell Equations","type":"tags"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/medical-applications/","section":"Tags","summary":"","title":"Medical Applications","type":"tags"},{"content":"","date":"2 February 2026","externalUrl":null,"permalink":"/tags/rf/","section":"Tags","summary":"","title":"RF","type":"tags"},{"content":"Hi!\nFor the past few years, I have been working on private electronic projects. 
With growing knowledge from books and university, the concepts have become increasingly complex - I need a way to cleanly document my process of making:\nThis is what I developed! (click to zoom) Figure 1: Illustration showing the data flow My requirements for a blogging system # Low effort to write and publish - otherwise I won\u0026rsquo;t write Runs on the already rented VPS to avoid additional cost \u0026amp; reuse the domain Lightweight and fast Hugo as the core system # The core system is a static site generator called Hugo. It\u0026rsquo;s lightweight and generates fast and sleek websites. The visuals can be customized with a plethora of themes (this blog uses the Blowfish theme).\nWrite articles in Markdown format Preview website locally Execute hugo server in the CLI Access the web page under http://localhost:1313 Deploy using the Nginx web server. Pro # You are in control (you build and serve the website yourself) Markdown is used The whole website is a \u0026ldquo;simple\u0026rdquo; folder Use your own editor to write Markdown Con # No on-website article editor for quick changes Pretty involved workflow; you will definitely need some hours to get up and running (I needed around 20-30 hours) 1. Step: Own Git server (using Gitea) # To build the blog website, a bunch of build files collected in a directory is needed. The directory contains:\nArticles in Markdown format Images Configuration files in TOML format Themes Dockerfile and docker-compose.yml In addition, git adds versioning of your articles.\nNow one could ask: Why do you need your OWN Git server? Because you need to store the photos somewhere! My blog, starting with two articles, is already 350 MB in size!\n2. Step: Webhook on push # When committing a new article or change to my repo, I want the website to update. So I set things up so that a git push notifies my Portainer service!\n3. Step: Container rebuild # Portainer is a service that manages Docker containers. 
When a notification from git arrives, it pulls the repo and sets up a new nginx webserver:\nDockerfile\n# 1. Get requirements\nFROM hugomods/hugo:exts AS builder\n# 2. Prepare workspace\nWORKDIR /src\nCOPY . .\n# 3. Clean up any broken submodule leftovers and clone fresh\nRUN rm -rf themes/blowfish \u0026amp;\u0026amp; \\\n    git clone --depth=1 https://github.com/nunocoracao/blowfish.git themes/blowfish\n# 4. Build the site (using -D for drafts and --gc for cleanup)\nRUN hugo -D --gc\n# 5. Serve the site with Nginx\nFROM nginx:alpine\nRUN rm -rf /usr/share/nginx/html/*\nCOPY --from=builder /src/public /usr/share/nginx/html\nEXPOSE 80\ndocker-compose.yml for setting up the Docker container stack\nversion: \u0026#39;3\u0026#39;\nservices:\n  blog:\n    build: .\n    restart: always\n    networks:\n      - docker_website_network # Join the specific network\nnetworks:\n  docker_website_network:\n    external: true # \u0026#39;external\u0026#39; means \u0026#34;use the existing network, don\u0026#39;t create a new one\u0026#34;\n3.5 Step: SSL \u0026amp; Monitoring # For management of domains and subdomains, I set up an Nginx reverse proxy. It also helps manage certificates so that the website can be accessed via HTTPS.\nAdditionally, I want to know how well my website performs! I chose Beszel, Dozzle and Umami to give me system stats, Docker stats and website visitor stats. These services have their own dashboards, which I locked away. Setting these up took me half a day.\n4. Step: HTTPS # You can access the website! The website is simple, static, and loads fast.\nChallenges # There were some significant challenges and questions to get this blog up and running!\nBefore thinking about file storage size, I used a public git provider - however, they don\u0026rsquo;t allow much storage per repository - so I decided to spin up my own Gitea instance.\nGemini told me to use the LFS feature in git to store images in repos more efficiently. 
However, after I moved to my own git instance, I wanted to deactivate LFS for simplicity - this turned out to be NOT easy.\nThe CI/CD pipeline takes some time to set up. The webhook in step 2 is easy, but the automatic container build is not - also, if the container setup fails, you mostly get only error numbers, not detailed error logs.\nHugo recommends adding themes as a git submodule. This turned out not to be compatible with the automatic build process of Portainer.\nUse a real name or a pseudonym? In the end, I thought a real name would be cooler.\nHardware needed # This website is served using the lowest tier of a rented Linux VPS (virtual private server) at 1blu.de:\n4 CPU cores 8 GB RAM 120 GB SSD A VPS means that the resources are shared, but SSH and root access are provided. In my experience, this hardware tier is enough to run a Minecraft server with at least 8 players, plus an InfluxDB and Grafana dashboard service for my beehive monitoring.\nHow to publish using my workflow # Edit the local Hugo project by adding a new article in Markdown Preview your articles locally in your browser Push changes to the online git repo Wait for the automatic update (around 30 seconds) My git service informs the Portainer Docker container on my VPS Portainer pulls my git repo and rebuilds the whole stack The new stack contains only a simple nginx container with the new static website You can access the new article. Conclusion: Should you also choose Hugo? # So if you are into DIY server-management tasks, this is definitely for you. I don\u0026rsquo;t have any long-term experience with it yet, but I assume it won\u0026rsquo;t be that maintenance-heavy. 
Since I already write a lot of notes in Markdown and use git anyway, the workflow won\u0026rsquo;t be a problem for me.\nBut if you are not into these tasks, you should definitely choose a simpler alternative that doesn\u0026rsquo;t take so long to get up and running - in my case it took like 3 holidays in between university lectures! Maybe there are services on the internet that manage Hugo for you?\nSo let\u0026rsquo;s see - if more articles keep popping up, then this system works xD\nI am interested in your thoughts! - Reply with a simple Email Have a nice day,\nCarl\n","date":"16 January 2026","externalUrl":null,"permalink":"/articles/tech-blog/","section":"Articles","summary":"","title":"Building a self-hosted Tech Blog","type":"articles"},{"content":"","date":"16 January 2026","externalUrl":null,"permalink":"/tags/git-ci/cd/","section":"Tags","summary":"","title":"Git CI/CD","type":"tags"},{"content":"","date":"16 January 2026","externalUrl":null,"permalink":"/tags/gitea-self-hosted-git/","section":"Tags","summary":"","title":"Gitea Self-Hosted Git","type":"tags"},{"content":"","date":"16 January 2026","externalUrl":null,"permalink":"/tags/hugo-website-framework/","section":"Tags","summary":"","title":"Hugo Website Framework","type":"tags"},{"content":"","date":"16 January 2026","externalUrl":null,"permalink":"/tags/portainer-docker-management/","section":"Tags","summary":"","title":"Portainer Docker Management","type":"tags"},{"content":"","date":"29 November 2025","externalUrl":null,"permalink":"/tags/gnu-radio-companion/","section":"Tags","summary":"","title":"GNU Radio Companion","type":"tags"},{"content":"","date":"29 November 2025","externalUrl":null,"permalink":"/tags/hackrf/","section":"Tags","summary":"","title":"HackRF","type":"tags"},{"content":"","date":"29 November 2025","externalUrl":null,"permalink":"/tags/reverse-engineering/","section":"Tags","summary":"","title":"Reverse-Engineering","type":"tags"},{"content":"Hi!\nUntil the end 
of 2025, I worked as an assistant for RF (radio frequency) engineers. It can be a very theoretical field, so I decided to do some hands-on projects at home! I wondered: Could I reverse-engineer a simple toy car remote and figure out how it communicates with the car? It turns out it\u0026rsquo;s possible! And you can do much more, too!\nDemo # I was able to play a computer racing game with a toy car remote!\nVideo 8: Playing a racing game with a toy car remote Overview\nFigure 46: Overview This is the system I created – I\u0026rsquo;ll explain it in this article and describe the process of reverse-engineering the remote\u0026rsquo;s radio connection and developing a custom, real-time decoder. Although this might sound intimidating, I want to make the topic approachable by avoiding a lot of technical terms and relying on plenty of photos and videos.\nIntroduction # Thanks to my time at university, I already have some background in signal processing, communication engineering and related fields. I read about the HackRF and GNU Radio on the internet, so I decided to give it a try. I bought a HackRF and a toy car from eBay Kleinanzeigen (the German equivalent of Craigslist), so I had something to start with!\nLet\u0026rsquo;s start small by looking at the toy car.\nLightning McQueen! (The actual toy car) # Figure 47: toy car Figure 48: toy car remote controller Video 9: Lightning McQueen Test! The toy car works, yay! However, I did need to replace the batteries.\nIntroduction to the radio world # So let\u0026rsquo;s start simple - what are radio signals about?\nWhen the current in a wire alternates many millions of times per second, the electric and magnetic fields around the wire start to interact with each other and propagate (travel) away from the wire into the air. We call the wire an antenna, and it transmits at a frequency. 
We usually measure this in megahertz (MHz), where 1 MHz means one million cycles per second.\nNow, as these electromagnetic waves propagate through the room, they will bump into other wires. When they do, they create a tiny voltage inside those wires at the exact same frequency, which we can then measure. By using electronic components like amplifiers and filters, we can make this tiny signal usable.\nThe cool thing is that one can choose exactly what frequency to transmit and receive on! And there is a massive number of different frequencies – ranging from a few MHz to around 60 GHz. Here is a short overview:\n1 kHz - 1,000 cycles per second 1 MHz - 1,000,000 cycles per second 1 GHz - 1,000,000,000 cycles per second 1 THz - 1,000,000,000,000 cycles per second Now, the complete collection of all frequencies is called the \u0026ldquo;frequency spectrum\u0026rdquo;. If you need a specific range of frequencies for your project, you use a frequency \u0026ldquo;band\u0026rdquo;.\nSome frequency examples:\nCa. 100 MHz: Typical kitchen FM radio music Ca. 150 MHz: Amateur (HAM) radio people talking to each other 2.4 GHz: Wi-Fi, Bluetooth, microwave ovens 5 GHz: Faster Wi-Fi Now you might think electromagnetic waves can only be used for communication, but there are many other uses! For example:\nCa. 50 kHz: Wireless charging (at this low frequency, it relies mostly on magnetic coupling effects rather than truly radiating radio waves) Ca. 60 GHz: Radar applications that measure distance and speed. Ca. 550 THz: Visible green light! (Light is an electromagnetic wave too!) 
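These frequencies map directly to wavelengths via lambda = c / f, which is also why antennas for different bands come in such different sizes. A quick sketch in Python, using the example frequencies from the lists above:

```python
# Wavelength from frequency: lambda = c / f
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_m(freq_hz):
    """Wavelength in metres for a given frequency in Hz."""
    return C / freq_hz

print(round(wavelength_m(100e6), 2))  # FM radio: ~3.0 m
print(round(wavelength_m(2.4e9), 3))  # Wi-Fi: ~0.125 m
print(wavelength_m(550e12))           # green light: ~5.45e-07 m (545 nm)
```

So an FM antenna works with metre-scale waves, while a Wi-Fi antenna fits on a fingernail.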
When engineers develop radio communication systems, there are a lot of tasks to be done:\nChoose a frequency Develop radio components: antenna (physical dimensions, material), amplifiers, filters Develop the radio modulation: Decide which property of the radio signal actually carries the information Develop radio protocols: Figure out rules for who should transmit and receive at which exact time, so that everyone\u0026rsquo;s signals don\u0026rsquo;t crash into (interfere with) each other. Develop software: Make this technology accessible so that software developers can include it in their products easily! What is a HackRF # Figure 49: hackrf and portapack combo My personal goal is to understand and model the radio connection between the car and the remote. To get started tinkering with radio communication, a device like a HackRF is needed.\nA HackRF is like a Swiss Army knife for receiving and also transmitting radio signals between 1 MHz and 6 GHz. It can be used to take part in amateur radio conversations, know what planes are over your head, hack toys and more. A friend of mine uses the HackRF to receive and decode images transmitted by satellites. In the radio/RF engineering world, it\u0026rsquo;s called a software-defined radio, or SDR for short. And yes, like any tool, this device can be used for illegal stuff too - like hacking car key fob signals to break into cars. I will refrain from this and use the device only for receiving remote commands for educational purposes.\nActually, the HackRF is just a green PCB inside the device I am holding in my hand. What you see is an extension of the HackRF called the \u0026ldquo;Portapack\u0026rdquo;, which makes it portable! It adds the nice screen, buttons, battery and more. 
If you are interested in what the HackRF looks like without the extension, take a look at Attachment 1: The inside of the HackRF/Portapack.\nChapter 0: Let\u0026rsquo;s look more closely at the remote # Figure 50: Remote controller back side Do you spot the \u0026ldquo;40 MHz\u0026rdquo; label? Maybe that\u0026rsquo;s the frequency the remote is communicating on? If you are interested in what the disassembled car and remote look like, refer to Attachment 2.\nChapter 1: Finding the toy car remote signal # The first challenge is to find the remote signal. With the HackRF/Portapack, we can use the tool \u0026ldquo;Looking Glass\u0026rdquo;, which shows us the signal strength at all frequencies. Let\u0026rsquo;s tune to around 40 MHz!\nVideo 10: Find the signal in the spectrum What did I do in Video 10:\nBoot the HackRF/Portapack Open the Looking Glass app to find signals Select a frequency range around 40 MHz - to be specific: 38 MHz - 45 MHz Whenever we select a frequency span of less than 20 MHz, we see a bright vertical line exactly in the center of the screen. We just need to ignore this. When moving the joysticks, another bright vertical line appears on the screen, with some blue noise on the sides. That\u0026rsquo;s the signal! By the way: This display type is called a waterfall diagram. The frequencies are on the horizontal axis, time is on the vertical axis. The marker tool shows that the signal is not exactly at 40 MHz; it\u0026rsquo;s between 40.6 MHz and 40.7 MHz (actually, it\u0026rsquo;s 40.685 MHz). 
What exactly is on the HackRF screen?\nFigure 51: Illustration explaining the frequency spectrum waterfall diagram There are even more observations:\nThe signal is completely gone when not sending commands The frequency used seems to always be the same for every command It\u0026rsquo;s difficult to guess the type of modulation (which properties of the signal carry the commands like steering, accelerating and more) Why did the manufacturer choose exactly 40.685 MHz?\nSince radio frequencies are accessible to anybody and can be used for all kinds of communication applications and more, they are a highly valuable resource. The German Bundesnetzagentur has allocated specific frequency ranges/bands very precisely to different applications/companies. One can download a table with all the frequency allocations in Germany – on page 753, the entry that matters for us can be found:\nFigure 52: German frequency allocation excerpt for remote controlling RC models (like our toy car) The frequency range 40.66 MHz to 40.7 MHz is specifically allocated to remote-controlled models – like our toy car.\nChapter 2: Use GNU Radio Companion to find the commands # Connecting the HackRF to a computer allows for much more functionality. Instead of the Portapack extension processing the signal, now the computer does it with much more processing power and flexibility.\nWhat is GNU Radio Companion (GRC)?\nFigure 53: GNU Radio Companion in action The bundle of GNU Radio and GNU Radio Companion (GRC) is a digital signal processing playground. It\u0026rsquo;s basically a toolkit with the following procedure:\nWith GNU Radio Companion, complex decoder and transmitter flows can be designed by dragging and dropping blocks (seen in figure 8) When clicking the play button: Python code is automatically generated and executed, which depends on the GNU Radio Python library Live graphs can be shown, as seen in figures 8 and 10. 
Let\u0026rsquo;s look at our receiver data flow in figure 9: Inside the white area in the center of the window, multiple blocks can be seen. The first, big block on the left, called \u0026ldquo;osmocom source\u0026rdquo;, is the HackRF signal source. The block on the right, called \u0026ldquo;QT GUI Waterfall Sink\u0026rdquo;, is the receiver of the HackRF signal. When pressing play, the \u0026ldquo;QT GUI Waterfall Sink\u0026rdquo; block opens the window shown in figure 10. Both blocks can be configured with a lot of different parameters like sample rate, center frequency and more.\nFigure 54: GNU Radio Companion flow graph Figure 55: GNU Radio waterfall diagram showing the remote signal with the button sometimes pressed, sometimes not. Now one could ask - why the hell should I use the complicated GNU Radio Companion instead of the built-in functions on the HackRF/Portapack? Let\u0026rsquo;s compare:\nWhen to use the integrated HackRF/Portapack functions\nWhen being on the go When only simple, common functionality is needed When starting out with an experiment When to use GNU Radio Companion\nFor custom signal processing! We\u0026rsquo;ll see later what that means. For a bigger screen and more graphs - with our new screen, we see the signal is exactly at 40.685 MHz Very complicated to get up and running though! Chapter 3: Use GRC to find out more about the signal encoding # Now comes the part where we actually need a tool like GRC. With the signal recorded by the HackRF, we want to find out the signal properties that tell us the command from the remote controller – aka what kind of modulation the remote controller uses! In university lectures like communication fundamentals, different modulation techniques are presented.\nLet me present some common, very simple modulation techniques:\nAmplitude modulation (AM)\nFigure 56: Amplitude modulation. Blue wave: Transmitted signal. 
Green wave: Actual information This is the simplest method: The amplitude (similar to the signal strength) actually tells us the information. In old radios, AM was used to transmit music. And it probably was analog - meaning there was an infinite range of levels between a low and a high amplitude. In digital communications, there are only certain amplitude levels:\nFigure 57: Amplitude shift keying modulation. Blue wave: Transmitted signal. Green wave: Actual information The simplest digital amplitude modulation is so-called on-off keying – like Morse code: There are only two amplitude levels, on or off!\nFigure 58: On-off keying modulation. Blue wave: Transmitted signal. Green wave: Actual information Frequency modulation (FM)\nFigure 59: Frequency modulation. Blue wave: Transmitted signal. Green wave: Actual information With this method, we don\u0026rsquo;t vary the amplitude but the frequency (very slightly). This can be observed by the receiver, which listens to a frequency range. Kitchen radios have used FM since the 1970s. If we want to transmit digital information instead of sound, so-called \u0026ldquo;frequency shift keying\u0026rdquo; can be used:\nFigure 60: Frequency shift keying modulation. Blue wave: Transmitted signal. Green wave: Actual information Phase shift keying (PSK)\nFigure 61: Phase shift keying modulation. Blue wave: Transmitted signal. Green wave: Actual information In this case, the frequency and amplitude stay the same, but the phase of the wave jumps at certain points, which carries the information. This could also be used.\nThere are many more: quadrature amplitude modulation, quadrature phase shift keying, etc.\nWhat kind of modulation could our signal be?\nLet\u0026rsquo;s compare the waterfall diagram of a typical kitchen frequency-modulated radio signal with that of the remote controller signal. 
I recorded both - all parameters are the same except the center frequency of the HackRF.\nFigure 62: Waterfall diagram of typical kitchen FM radio signal Figure 63: Waterfall diagram of the remote signal with the button always pressed. What do we observe?\nMany more signals in Figure 62 : In the waterfall graph, if we look closely, we see many signals displayed as yellow-ish vertical lines. The most dominant signal in Figure 62 is at ~98.1 MHz, the second most dominant one at ~102.1 MHz. In the second graph, we see mostly one main signal – that\u0026rsquo;s our remote controller signal at 40.685 MHz. Perfect red line in the center: In both graphs, we have a deep red center line. We need to ignore it, as it\u0026rsquo;s an artifact of the HackRF. The important point: The FM signals in Figure 62 seem to wobble around a specific center frequency, while our remote controller signal in Figure 63 is much more focused on a single center frequency. Since the controller signal at 40.685 MHz basically doesn\u0026rsquo;t vary in frequency, we can assume it uses some kind of amplitude modulation! Chapter 4: Another viewing angle of the signal to reveal the protocol # Since our suspect is amplitude modulation (AM), we can start building a receiver. Let\u0026rsquo;s look again at the amplitude modulation:\nFigure 64: Amplitude modulation. Blue wave: Transmitted signal. Green wave: Actual information What we receive with the HackRF is something like the blue wave. The green wave, however, displays the amplitude – and that green wave is what we want to get! The plan:\nFigure 65: Record AM data at 40.685 MHz Osmocom Source block: we receive signals at a center frequency of 41 MHz and a span (aka bandwidth) of 5 MHz (so we receive signals between 38.5 MHz and 43.5 MHz). 
Frequency Xlating FIR Filter block: A complicated block that selects only our narrow frequency range around 40.685 MHz Complex to Mag block: Gets the magnitude of our signal, which is essentially the same as the amplitude QT GUI Time Sink block: Just a live plot that displays the magnitude over time The following graphs now show just the signal amplitude, which is the green wave in Figure 64.\nWhat do we observe?\nFigure 66: Live remote controller signal amplitude when going forward Wow! That looks promising! What is going on when trying another button?\nFigure 67: Live remote controller signal amplitude when going backward And when not pressing anything?\nFigure 68: Live remote controller signal amplitude when not pressing any key Okay, that\u0026rsquo;s pretty cool, huh? That delivers several new insights!\nThere only seem to be two major amplitude levels. That means we have an OOK-type of amplitude modulation! (Either the signal is on, or it is off.) If a key is continuously pressed, one can see a sequence of pulses that gets repeated. This sequence can be divided into two alternating phases: Phase 1 consists of four long pulses and is the same for every remote command. It could be a preamble to wake the toy car up! Phase 2 consists of short pulses – the number of which changes depending on the key pressed. Here is a quick illustration of my thoughts:\nFigure 69: Illustration explaining the command structure I tried every command on the controller. Phase 1 is always the same; these are the pulse counts in phase 2:\n| Command | Pulse count |\n| --- | --- |\n| STOP | 4 |\n| FORWARD | 10 |\n| LEFT | 58 |\n| RIGHT | 64 |\n| BACKWARD | 40 |\n| FORWARD_FULLSPEED | 22 |\n| FORWARD_FULLSPEED_LEFT | 28 |\n| FORWARD_FULLSPEED_RIGHT | 34 |\n| BACKWARD_RIGHT | 46 |\n| BACKWARD_LEFT | 52 |\nSide-note: We need to hope the car does not do an initial hand-shake with the remote when they are turned on. 
Also, we hope there is not a more intricate protocol involved and that the remote controller just repeatedly sends the commands to the car.\nChapter 5: Use Python to find out even more # In the last chapter, we got a pretty good idea of the protocol system of the remote. It\u0026rsquo;s cool to be able to decode commands by hand now by looking at the graphs. But it would be even cooler to do the decoding automatically! Essentially, we want GRC to automatically detect the preambles in phase 1, then count the pulses in phase 2 and somehow match them to the commands.\nProblem: For automatic detection of the preambles and the pulses afterwards, we need to know how many times per second the magnitude can flip on or off. In communication engineering, we call that the \u0026ldquo;symbol rate\u0026rdquo; - the number of symbol events per second. To detect it, we can take a look at the frequency content of the signal.\nThe plan: First, with the following GRC flow, we record the magnitude while a command is pressed continuously:\nFigure 70: Record AM data at 40.685 MHz Second, we use a little vibe-coded Python script:\nOpen and parse the recording Do a \u0026ldquo;Fourier transform\u0026rdquo; to get the frequencies of the signal Search for the frequency with the maximum magnitude Plot both the signal over time and its frequencies Mark the frequency with the maximum magnitude in the frequency plot Display the frequency also in the time plot by putting a red line after each period The result:\nFigure 71: Automatic symbol rate detection What do we observe? The frequency with the maximum magnitude is 989 Hz. We can see a peak under the red line in the frequency domain plot. When looking at the time plot, we see one red line for every peak. However, that\u0026rsquo;s not quite what we want! We want the symbol rate to match the rate of magnitude on/off flips - and that should be double the detected frequency, around 1.9 kHz. 
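The core of that symbol-rate detection can be sketched with numpy. This is a simplified stand-in for the actual vibe-coded script: the function name and the synthetic test signal at the bottom are purely illustrative, and the doubling of the peak frequency reflects the observation above.

```python
import numpy as np

def estimate_symbol_rate(envelope: np.ndarray, sample_rate: float) -> float:
    """Estimate the OOK symbol rate from a recorded magnitude (envelope) signal.

    The strongest non-DC frequency component approximates the pulse
    repetition rate; each pulse has an on AND an off half, so the
    symbol rate is roughly twice that frequency.
    """
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / sample_rate)
    peak_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return 2.0 * peak_freq

# Quick self-check with a synthetic 1 kHz on/off square wave sampled at 48 kHz:
fs = 48_000
t = np.arange(fs) / fs
envelope = (np.sin(2 * np.pi * 1000 * t) > 0).astype(np.float32)
print(estimate_symbol_rate(envelope, fs))  # → 2000.0
```

In a real run, the envelope would come from the recording written by the GRC flow instead of a synthetic square wave.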
Let\u0026rsquo;s plot this frequency too!\nFigure 72: Automatic symbol rate detection - use DOUBLE the rate! That\u0026rsquo;s looking much better, isn\u0026rsquo;t it? – The symbol rate really is 1,973.33 Hz! We now have everything to build an automatic decoder, as we know how often we need to look out for a flip in the amplitude.\nChapter 6: Use GRC to decode the commands in real-time # We have now found out everything we need to reverse-engineer the toy car receiver with GRC. I developed two iterations of decoders:\nDecoder version 1 # Figure 73: GRC decoder version 1 The signal chain consists of the following parts:\nAgain: osmocom Source: HackRF signal source Again: Frequency Xlating FIR Filter block: selects only our narrow frequency range around 40.685 MHz Again: Complex to Mag block: gets the magnitude of our signal, which is essentially the same as the amplitude Low pass filter, Multiply Const, etc.: VERY, very overcomplicated signal processing that should clean up the signal by extracting only ones and zeros. First, an automatic gain control results in the signal always having an amplitude of one; second, a threshold is applied to the signal to get a binary signal that contains either ones or zeros. QT GUI Time Sink, displaying: The raw signal (very small amplitude) The \u0026ldquo;normalized\u0026rdquo; signal with an amplitude around one. This works as long as the distance between the remote controller and the HackRF is within 3 m. The binary signal, which is either ones or zeros. 
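Conceptually, the "overcomplicated" cleanup chain of decoder version 1 boils down to two steps: normalize the envelope so its local peak sits near one, then threshold it into bits. A minimal numpy sketch of that idea (window size and threshold are my own placeholder values, not the exact GRC block parameters):

```python
import numpy as np

def to_binary(magnitude: np.ndarray, window: int = 512, threshold: float = 0.5) -> np.ndarray:
    """Crude stand-in for the GRC cleanup chain:
    1) scale each window so its peak is ~1 (a poor man's AGC),
    2) apply a threshold to get a stream of ones and zeros."""
    out = np.zeros(len(magnitude), dtype=np.uint8)
    for start in range(0, len(magnitude), window):
        chunk = magnitude[start:start + window]
        peak = chunk.max()
        if peak > 0:
            chunk = chunk / peak  # normalize the amplitude to ~1
        out[start:start + window] = (chunk > threshold).astype(np.uint8)
    return out

# Example: a weak on/off envelope becomes a clean bit stream
env = np.array([0.01, 0.02, 0.40, 0.42, 0.03, 0.41, 0.02], dtype=np.float32)
print(to_binary(env, window=7))  # → [0 0 1 1 0 1 0]
```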
This version still misses a lot of signal processing:\nPreamble detection \u0026amp; pulse counting Matching pulse counts to commands Somehow simulating keyboard commands to play computer games Live view\nFigure 74: Analyzed signal when pressing forward Figure 75: Analyzed signal when pressing nothing Video 11: Detecting the signal (first try) Decoder version 2 # Although the first, naive approach provided some insights, it is still not possible to really detect commands. This is why I developed another signal chain:\nFigure 76: GRC decoder version 2 General strategy for demodulation:\nRecord the signal amplitude like before High Pass Filter: Centers the amplitude signal vertically, so it\u0026rsquo;s zero on average. AGC: Instead of developing the automatic gain control myself, I used a pre-built block. QT GUI Time Sink: We all want to know what goes on in the signal chain to ensure it runs fine! Symbol Sync: This is a very important block - it uses the incoming signal to detect the important bit flips in our signal. It reduces (aka decimates) the recorded samples to only the few relevant ones. Binary Slicer: Cleans the signal to be only ones or zeros. The output is pink since the signal data type is now single characters. Correlate Access Code - Tag: Continuously searches the signal for the access code (the pulses of phase one of the remote command). Char to Float: Converts the signal back to floating-point numbers QT GUI Time Sink: We also want to see this signal! 
Command decoder: This is a custom Python block, again vibe-coded with AI: If the Correlate Access Code - Tag block indeed finds the phase 1 part of the signal: Count the rising edges (the short pulses of phase 2) Match the recorded pulse count to the pulse counts of the known commands Display the decoded command in the UI window Live view\nFigure 77: Analyzed signal when pressing forward Figure 78: Analyzed signal when pressing nothing Video 12: Demo of decoder version 2 Chapter 7: Play computer games with the toy car remote # At this point, GNU Radio provides info on which command is transmitted and when. It\u0026rsquo;s very cool to see the command decoding in real life!\nNow how do we play computer games with it?\nTo control computer games, we need mouse, keyboard or gamepad input. Fortunately, this can be simulated using Python libraries such as pynput. My concept is:\nModify the custom Command decoder Python block in GNU Radio Companion to set up a localhost UDP server and send the command strings over it. External applications on the same computer can then use the decoded commands to do more things. Another, external Python script is executed while GNU Radio is running, which: Acts as a UDP client, receiving the respective command strings Presses/releases keyboard keys virtually. Result: The complete data flow \u0026amp; full demo # We now have everything – from the toy car remote controller to the computer game. 
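The receiving end of that concept can be sketched as a small stand-alone script. Everything here is illustrative rather than my exact code: the port number and key bindings are arbitrary choices of mine, and pynput is the external dependency mentioned above.

```python
import socket

# Map decoded command strings (as sent by the GRC command-decoder block)
# to the game keys they should hold down. Bindings are hypothetical.
COMMAND_KEYS = {
    "FORWARD": ["w"],
    "BACKWARD": ["s"],
    "LEFT": ["a"],
    "RIGHT": ["d"],
    "STOP": [],
}

def keys_for(command: str) -> list[str]:
    """Return the keys to hold for a decoded command (empty = release all)."""
    return COMMAND_KEYS.get(command, [])

def run_bridge(port: int = 5005) -> None:
    """Receive command strings over UDP and press/release keys virtually."""
    from pynput.keyboard import Controller  # external dependency
    keyboard = Controller()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    held: set[str] = set()
    while True:
        data, _ = sock.recvfrom(64)
        wanted = set(keys_for(data.decode().strip()))
        for key in held - wanted:   # release keys no longer needed
            keyboard.release(key)
        for key in wanted - held:   # press newly needed keys
            keyboard.press(key)
        held = wanted
```

Keeping track of the currently held keys lets the bridge release them cleanly when the next command arrives.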
The following illustration shows the complete overview:\nFigure 79: Illustration: Overview Let me show you the full demo by:\nStarting and testing the GNU Radio processing pipeline Starting and testing the external, virtual keyboard controller Playing a computer game (in this case the old racing game FlatOut 2) Video 13: Demo: From GRC to Python Keyboard Emulation to Racing Game Conclusion \u0026amp; more ideas # It\u0026rsquo;s possible to reverse-engineer a toy car remote controller and use GNU Radio Companion with some custom Python scripts to play racing games! I had a lot of fun playing around with the systems and concepts in this topic.\nI learned a lot:\nHow to use my HackRF with GNU Radio Companion Some basics of RF modulation and signal processing Expectations for remote controllers in the future And of course, the adventure continues! Having experience with a simple toy car remote controller enables me to do more complex projects:\nReverse-engineer an FM remote controller: I got an FM remote controller into my hands and already started reverse-engineering it too. Since it is not as cheap and has a much better look and feel than the toy car\u0026rsquo;s controller, gaming with it on a computer is probably way better.\nBuilding the command decoder purely in hardware: The latency of the real toy car and its remote controller is much better than that of our custom decoder with the HackRF and GNU Radio (I don\u0026rsquo;t have objective measurements at hand). Building the decoder using analog hardware would significantly reduce latency. I already found some band pass filters in my cupboard that help detect the 40.685 MHz signal from our toy car remote. 
Adding some comparator chips and an Arduino may be enough to build a full-fledged receiver!\nFigure 80: A bunch of old bandpass filters Unfortunately not allowed: Steering the toy car with fake commands from the HackRF The legal boundary is that we aren\u0026rsquo;t allowed to transmit with the HackRF, since it is not certified like the original remote.\nHowever, it\u0026rsquo;s possible to rebuild the remote controller in GRC to simulate its signal:\nFigure 81: Prototype of a toy car remote controller, plus adding some noise to simulate background noise Video 14: Simulation of remote controller signal I am interested in your thoughts! - Reply with a simple Email Have a nice day,\nCarl\nAttachments # Attachment 1: The inside of the HackRF/Portapack # Figure 82: HackRF and Portapack combo To understand what\u0026rsquo;s in my hands, we have to take the device apart:\nFigure 83: Disassembly stage 1 Figure 84: Disassembly stage 2 Actually, only the green circuit board on the right in Figure 84 is the HackRF. It\u0026rsquo;s the radio and decoding part. I guess I have a fake one, but it has the same functionality as the original.\nOn the left, the so-called Portapack can be seen. On the rear, it contains the screen and the buttons from the first picture. It has a small computer and a battery, which makes the HackRF portable! With it, a lot of functionality is already provided, like scanning for signals, receiving and decoding radio signals from airplanes and boats, and simply listening to AM and FM radio. 
Connecting the device to the computer removes any limits (needed for the toy hack).\nAttachment 2: Inside of McQueen and the remote controller # Taking apart McQueen reveals some juicy electronics:\nThe inside of McQueen # Figure 85: McQueen bottom Figure 86: Inside of McQueen (with the old antenna) Figure 87: Inside of McQueen with flipped PCB The most important info is the label \u0026ldquo;40MHz\u0026rdquo; at the bottom of McQueen.\nMcQueen consists of these parts:\n2x rear wheels connected to a motor which can go forwards \u0026amp; backwards 2x front wheels, not connected to a motor, only for steering; they have a trimmer for fine adjustment Battery compartment for 3x AA batteries (so 4.5 V in total) Antenna (I guess a little more than 30 cm long) Controller PCB with batteries, motors \u0026amp; antenna connected to it On/off switch on the bottom Initially, the antenna was broken but still functional. I fixed it by replacing it with a cable of matching length.\nThe inside of the remote controller # Look, there is again another 40 MHz label!\nFigure 88: Remote controller back side Figure 89: Remote controller from the inside (first view) Next to the control sticks\u0026rsquo; back sides, one can see a PCB with a crystal and several inductors on it.\nFigure 90: Remote controller from the inside (second view) One can see the controller board with antenna, batteries and an LED connected to it.\nThe electronics of the remote controller consist of these parts:\nTwo control sticks left stick for gas: Four positions possible: idle, backward, forward, forward with more speed. It\u0026rsquo;s not a continuous potentiometer right stick for steering: Three positions available: idle, left, right. Also no continuous potentiometer. Battery compartment for 3x AAA batteries (for 4.5 V in total) There is no on/off switch! 
LED indicating whether the remote is currently transmitting Antenna (around 25 cm long) Controller circuit board ","date":"29 November 2025","externalUrl":null,"permalink":"/articles/toy-car-hacking/","section":"Articles","summary":"","title":"Reverse-engineering a toy car remote to play computer games","type":"articles"},{"content":"","date":"20 November 2025","externalUrl":null,"permalink":"/tags/ai/","section":"Tags","summary":"","title":"AI","type":"tags"},{"content":"","date":"20 November 2025","externalUrl":null,"permalink":"/tags/data-science/","section":"Tags","summary":"","title":"Data Science","type":"tags"},{"content":"Hi!\nRecently, I took the course \u0026ldquo;Advanced Machine Learning\u0026rdquo; at university. While the curriculum was mathematically rigorous, the concepts were very cool – from the basic backpropagation algorithm to LoRA adaptations of large language models. Since the backpropagation algorithm is the basis of AI and of most optimizations and extended concepts, I programmed it from scratch some time ago in Python with the numpy library. Now I want to continue – and I want to know: How difficult is it to deploy a trained model on a microcontroller? And what are the limitations of AI on such limited hardware? Let\u0026rsquo;s find out!\nDemo # This is what I\u0026rsquo;ve built!\nVideo 1: Demo showing live digit recognition You see an Arduino Uno, a microcontroller with very limited computing power, connected to a small screen and a joystick. With the joystick, I can \u0026ldquo;draw\u0026rdquo; digits, which then get automatically recognized by the AI model on the Arduino, as seen in the number that pops up in the top left corner of the screen. You see, it\u0026rsquo;s sometimes wrong! In the article, I\u0026rsquo;ll dig into the dataset to find out why.\nEssentials/TLDR for advanced readers # Essentially, I transformed the MNIST handwritten digit dataset. 
Instead of training on 28x28 pixel greyscale images, I use a point cloud of 25 points for each image. It\u0026rsquo;s like the children\u0026rsquo;s game connect the dots – I don\u0026rsquo;t store the whole image, just the coordinates. I normalize the bounding box of the dots and sort them by their y and then x coordinates. While drawing the digit on the screen with the joystick, I collect points sequentially, ensuring a minimum distance of 2 pixels between each dot, until the buffer of 25 dots is filled.\nWith 25 dots, each with an x and y coordinate, I have 50 input neurons instead of the standard 784. With only one hidden layer of 40 neurons and 10 output neurons, the total parameter count is 2,450. Training with momentum-based stochastic gradient descent achieves an accuracy of around 75%. During training, I use the ReLU and softmax activation functions for the hidden and output layers. On the Arduino itself, to save computation, I skip the softmax calculation in the output layer and just use the index of the neuron with the maximum value as the prediction.\nIn terms of storage: Even though the weights/biases are stored as 4-byte floats, they use only 9.8 kByte of the Arduino Uno\u0026rsquo;s 32 kByte flash memory (about 30%). During prediction, roughly 440 bytes of values are stored dynamically, utilizing about 20% of the 2 kByte SRAM. With matrix multiplications happening component-wise and sequentially directly on the CPU of the Arduino, the biggest impact on SRAM usage is storing the neural network values, which takes 400 bytes. Overall, data preprocessing and prediction on the Arduino Uno takes only ca. 
70 ms.\nMore coming soon # ","date":"20 November 2025","externalUrl":null,"permalink":"/articles/digit-recognition-arduino/","section":"Articles","summary":"","title":"Handwritten Digit Recognition with AI on Arduino Uno","type":"articles"},{"content":"","date":"20 November 2025","externalUrl":null,"permalink":"/tags/human-computer-interaction/","section":"Tags","summary":"","title":"Human-Computer-Interaction","type":"tags"},{"content":"","date":"20 November 2025","externalUrl":null,"permalink":"/tags/microcontroller/","section":"Tags","summary":"","title":"Microcontroller","type":"tags"},{"content":"","date":"10 July 2025","externalUrl":null,"permalink":"/tags/fpga/","section":"Tags","summary":"","title":"FPGA","type":"tags"},{"content":"","date":"10 July 2025","externalUrl":null,"permalink":"/tags/gaming/","section":"Tags","summary":"","title":"Gaming","type":"tags"},{"content":"Hi!\nIn the Smart Sensors university course, a colleague and I had the task: \u0026ldquo;Develop a gesture-based game controller to play a racing game\u0026rdquo;! Now you could say that\u0026rsquo;s easy: Just connect an Arduino with an accelerometer to the PC, and let the Arduino emulate the keyboard. Noo! In our course, we dealt with so-called Field-Programmable Gate Arrays, or FPGAs for short.\nFPGAs?? # These are complicated but pretty cool computer chips that can emulate any computer hardware you want! With a small, low-cost one, you can implement IoT devices with extremely low power consumption. With a slightly bigger one, you can emulate old game consoles by simulating the real hardware. And with huge, high-cost FPGAs, you can simulate whole CPUs, which helps validate your chip designs before starting the million-dollar chip manufacturing. 
The first attachment shows an extensive comparison between the different computing units.\nPros and Cons of FPGAs # FPGAs are cool because of their flexibility - they are essentially programmed in a hardware description language that defines exactly how the circuits are set up internally. The FPGA code is like a file storing a Minecraft world built only with redstone (although a little more efficient and hand-writeable!). There are different hardware description languages; popular ones are Verilog (which we learned) and VHDL.\nWith their flexibility, FPGAs allow for extreme optimization of programs. Since everything can be defined from scratch, unnecessary overhead can be completely stripped out, or custom hardware ideas can be introduced.\nAlso, everything runs in parallel – since an FPGA is essentially one big circuit!\nThese advantages also come with downsides: It is pretty complicated to develop FPGA code. Debugging is also not that easy. In addition, FPGAs are not as cheap as general-purpose microcontrollers.\nFPGAs! # Figure 1: The FPGA board Our Smart Sensors challenge # We were given the task to develop a gesture-based controller for a racing game – in other words, we were given our educational FPGA board, an accelerometer, and that was it. (Of course we were given a lot of guidance on how to code using Verilog etc.)\nThe racing game in question is the browser game \u0026ldquo;Two Punk Racing\u0026rdquo;.\nBy reading out the accelerometer\u0026rsquo;s data, we could implement several actions:\nAccelerate by tilting the hand forwards Steer by tilting the hand left / right Brake / drive backwards by tilting the hand backwards Also a bonus gesture: Tapping the acceleration sensor enables a temporary speed boost (\u0026ldquo;Nitro\u0026rdquo;). 
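In essence, the gesture-to-action mapping above is plain threshold logic. Here it is sketched in Python for readability (the real implementation is Verilog on the FPGA, see Attachment 2; the threshold value and action names here are illustrative):

```python
# Hypothetical filtered accelerometer readings in the sensor's raw units.
# The threshold mirrors the one on the FPGA but is a placeholder here.
TILT_THRESHOLD = 70

def classify_gesture(x: int, y: int, tapped: bool) -> str:
    """Map filtered accelerometer readings to a controller action."""
    if tapped:
        return "nitro"        # tap on the sensor → temporary speed boost
    if x < -TILT_THRESHOLD:
        return "steer_left"
    if x > TILT_THRESHOLD:
        return "steer_right"
    if y > TILT_THRESHOLD:
        return "accelerate"   # hand tilted forwards
    if y < -TILT_THRESHOLD:
        return "brake"        # hand tilted backwards → brake / reverse
    return "idle"

print(classify_gesture(0, 120, False))  # → accelerate
```

The priority order (nitro first, then steering, then throttle) matches the logic we later wrote in Verilog.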
Since the FPGA board can output data only via serial communication / UART, we set up the following data pipeline:\nThe FPGA sends characters over UART to the PC A Python agent listens to the UART communication and emulates a keyboard by virtually pressing the W/A/S/D keys (and N for nitro). The pressed keyboard keys control the game. We documented our process a little in this presentation. You can find a Verilog code snippet in Attachment 2.\nOpen PDF in new tab Presentation Figure 2: Me presenting the glove Summary # Although getting into FPGAs was quite hard – developing the SPI and UART interfaces from scratch is definitely not easy – it was quite fun as it all came together in the end! Also, by implementing the interfaces yourself, your understanding really improves. After the course, I wanted to play around with FPGAs some more, so I implemented a CPU interpreting the brainfuck programming language. I\u0026rsquo;ll probably write an article about it in the future!\nI am interested in your thoughts! - Reply with a simple Email Have a nice day,\nCarl\nAttachments # Attachment 1: Processing Unit Comparison # MCU: Microcontroller Unit (like Arduino) DSP: Digital Signal Processor (can be found in audio gear, medical devices, etc.) 
FPGA: Field Programmable Gate Array ASIC: Application-Specific IC (integrated circuit) Figure 3: Comparison of processing units Attachment 2: Nitro Glove Verilog code sample #\nmodule top(\n input wire hwclk,\n input wire spi1_miso,\n input wire adxl_int1,\n output wire ftdi_tx,\n output wire spi1_sclk,\n output wire spi1_mosi,\n output wire spi1_cs,\n output wire led0, led1, led2, led3, led4, led5, led6, led7\n);\n\n// System clock frequency (predefined by hardware crystal)\nparameter CLK_FREQ = 12_000_000;\n// Duration of Nitro mode\nparameter NITRO_TIMEOUT = CLK_FREQ/20;\n\n// SPI control signals\nreg [5:0] spi_address = 6\u0026#39;h00;\nreg spi_read_write = 1\u0026#39;b0;\nreg spi_start = 1\u0026#39;b0;\nwire spi_ready;\nreg [7:0] spi_data_in = 8\u0026#39;h00;\nwire [7:0] spi_data_out;\n\n///////////////////////////////////////////////////////////////\n// DEFINE ACCELEROMETER SPI INTERFACE\n\n// Accelerometer raw and filtered axis values \u0026#34;variables\u0026#34;\nreg [15:0] x_axis, y_axis;\nreg signed [15:0] x_axis_filtered = 0;\nreg signed [15:0] y_axis_filtered = 0;\nreg [7:0] x0, x1, y0, y1;\n\n// Instantiate an SPI interface to communicate with the ADXL345 (CPOL=1, CPHA=1)\n// Module is defined in another Verilog file\nspi_module #(\n .CPOL(1), .CPHA(1), .SCK_DIVIDE(24), .CPU_CYCLES_BETWEEN_SPI_COMMUNICATIONS(2)\n) spi_inst (\n .clk(hwclk), .start(spi_start), .ready(spi_ready), .read_write(spi_read_write),\n .address(spi_address), .data_in(spi_data_in), .data_out(spi_data_out),\n .spi_clk(spi1_sclk), .spi_mosi(spi1_mosi), .spi_miso(spi1_miso), .spi_cs(spi1_cs)\n);\n\n///////////////////////////////////////////////////////////////\n// DEFINE UART / SERIAL COMMUNICATION TO USER COMPUTER\nwire uart_clk;\nreg uart_clk_prev = 0;\nreg [7:0] uart_buf = 8\u0026#39;h78;\nreg en = 0;\nwire uart_busy;\n\n// Divide 12 MHz hardware clock down to 9600 baud (needed for UART)\nclock_divider #(.DIVIDE_BY(1250)) uart_clk_gen (\n .clk_in(hwclk), .reset(1\u0026#39;b0), .clk_out(uart_clk)\n);\n\n// Instantiate UART configuration and transmission\n// Module is defined in another Verilog file\nuart_tx_8n1 uart_tx_inst (\n .clk(uart_clk), .en(en), .Data(uart_buf), .busy(uart_busy), .uart_tx(ftdi_tx)\n);\n\n///////////////////////////////////////////////////////////////\n// DEFINE FINITE STATE MACHINE\n// (to have some kind of control flow and avoid everything being parallel)\nwire test_clk;\nreg test_clk_prev = 0;\nclock_divider #(.DIVIDE_BY(CLK_FREQ/60)) fsm_clk_gen (\n .clk_in(hwclk), .reset(1\u0026#39;b0), .clk_out(test_clk)\n);\n\n// FSM control variables\nreg [7:0] program_counter = 0;\nreg sensor_ready = 0;\n\n// Nitro mode trigger using ADXL345 INT1\nreg nitro_mode = 0;\nreg [25:0] nitro_counter = 0;\nreg adxl_int1_prev = 0;\nreg adxl_int1_rising = 0;\n\n///////////////////////////////////////////////////////////////\n// Use the accelerometer tap detection to enable turbo mode!\n// Detect rising edge on tap interrupt pin\nalways @(posedge hwclk) begin\n adxl_int1_rising \u0026lt;= (~adxl_int1_prev) \u0026amp; adxl_int1;\n adxl_int1_prev \u0026lt;= adxl_int1;\n if (sensor_ready \u0026amp;\u0026amp; adxl_int1_rising) begin\n nitro_mode \u0026lt;= 1;\n nitro_counter \u0026lt;= NITRO_TIMEOUT;\n end\n if (nitro_mode \u0026amp;\u0026amp; nitro_counter \u0026gt; 0)\n nitro_counter \u0026lt;= nitro_counter - 1;\n else if (nitro_mode \u0026amp;\u0026amp; nitro_counter == 0)\n nitro_mode \u0026lt;= 0;\nend\n\n///////////////////////////////////////////////////////////////\n// IF GLOVE TILT DETECTED IN SPECIFIC DIRECTION,\n// THEN SEND CHARACTER OVER UART TO COMPUTER\n\n// Gesture recognition thresholds\nparameter signed [15:0] X_LEFT_THRESH = -16\u0026#39;sd70;\nparameter signed [15:0] X_RIGHT_THRESH = 16\u0026#39;sd70;\nparameter signed [15:0] Y_FWD_THRESH = 16\u0026#39;sd70;\nparameter signed [15:0] Y_BWD_THRESH = -16\u0026#39;sd70;\n\nreg [7:0] gesture_char;\nalways @(*) begin\n if (nitro_mode) gesture_char = 8\u0026#39;h6E; // \u0026#39;n\u0026#39;\n else if (x_axis_filtered \u0026lt; X_LEFT_THRESH) gesture_char = 8\u0026#39;h6C; // \u0026#39;l\u0026#39;\n else if (x_axis_filtered \u0026gt; X_RIGHT_THRESH) gesture_char = 8\u0026#39;h72; // \u0026#39;r\u0026#39;\n else if (y_axis_filtered \u0026gt; Y_FWD_THRESH) gesture_char = 8\u0026#39;h66; // \u0026#39;f\u0026#39;\n else if (y_axis_filtered \u0026lt; Y_BWD_THRESH) gesture_char = 8\u0026#39;h62; // \u0026#39;b\u0026#39;\n else gesture_char = 8\u0026#39;h78; // \u0026#39;x\u0026#39;\nend\n\n///////////////////////////////////////////////////////////////\n// FSM logic\nalways @(posedge hwclk) begin\n if (test_clk \u0026amp;\u0026amp; !test_clk_prev \u0026amp;\u0026amp; spi_ready) begin\n program_counter \u0026lt;= 0;\n test_clk_prev \u0026lt;= 1;\n end else if (!test_clk \u0026amp;\u0026amp; test_clk_prev) begin\n test_clk_prev \u0026lt;= 0;\n end else if (program_counter == 0) begin\n // tap_threshold\n spi_address \u0026lt;= 6\u0026#39;h1D;\n //spi_data_in \u0026lt;= 8\u0026#39;h48; // Bigger tap threshold\n spi_data_in \u0026lt;= 8\u0026#39;h50; // Lower tap threshold\n spi_read_write \u0026lt;= 0;\n spi_start \u0026lt;= 1;\n program_counter \u0026lt;= 1;\n end else if (program_counter == 1) begin\n spi_start \u0026lt;= 0;\n program_counter \u0026lt;= 2;\n // ******************************************************\n // MORE FSM STATES IN BETWEEN: Read accelerometer X values using SPI\n // ******************************************************\n end else if (program_counter == 22 \u0026amp;\u0026amp; spi_ready) begin\n x1 \u0026lt;= spi_data_out;\n x_axis \u0026lt;= {spi_data_out, x0};\n // Applying low pass filter (\u0026#34;exponential moving average filter\u0026#34;)\n // This reduces noise in the accelerometer measurement data\n if (($signed({spi_data_out, x0}) - x_axis_filtered \u0026gt; 100) || (x_axis_filtered - $signed({spi_data_out, x0}) \u0026gt; 100))\n x_axis_filtered \u0026lt;= $signed({spi_data_out, x0});\n else\n x_axis_filtered \u0026lt;= x_axis_filtered + (($signed({spi_data_out, x0}) - x_axis_filtered) \u0026gt;\u0026gt;\u0026gt; 1);\n // ******************************************************\n // MORE FSM STATES IN BETWEEN: Read accelerometer Y values using SPI\n // Also apply the filter on the Y values\n // ******************************************************\n end else if (program_counter == 27 \u0026amp;\u0026amp; uart_clk \u0026amp;\u0026amp; ~uart_clk_prev) begin\n if (!uart_busy \u0026amp;\u0026amp; en) en \u0026lt;= 0;\n end\n uart_clk_prev \u0026lt;= uart_clk;\nend\n\n// LED debug indicators\nassign led0 = test_clk;\nassign led1 = spi_start;\nassign led2 = !spi_ready;\nassign led3 = uart_busy;\nassign led4 = sensor_ready;\nassign led5 = adxl_int1;\nassign led6 = adxl_int1_rising;\nassign led7 = nitro_mode;\n\nendmodule ","date":"10 July 2025","externalUrl":null,"permalink":"/articles/gesture-controlled-game/","section":"Articles","summary":"","title":"Nitro Glove: Gesture-Controlled Gaming using FPGA on glove","type":"articles"},{"content":"","date":"14 April 2024","externalUrl":null,"permalink":"/tags/databases/","section":"Tags","summary":"","title":"Databases","type":"tags"},{"content":"","date":"14 April 2024","externalUrl":null,"permalink":"/tags/grafana-dashboard/","section":"Tags","summary":"","title":"Grafana Dashboard","type":"tags"},{"content":"","date":"14 April 2024","externalUrl":null,"permalink":"/tags/lora-radio-communication/","section":"Tags","summary":"","title":"LoRa Radio Communication","type":"tags"},{"content":"The aim of this project: Automatically monitor a beehive to assist a beekeeper in interpreting the bees\u0026rsquo; health and current state in the short and long term.\nStarting situation: As I got deep into electronics and making in high school, my teacher gave me the opportunity to do an extra learning project contributing to my 2021 high school diploma. And we do beekeeping as an extracurricular at our high school!\nWhy am I doing this: I really like electronics, understanding systems and using / creating them. While playing around with Raspberry Pis / Arduinos and sensor kits, my head filled with ideas. 
My grandma was a beekeeper, so I always had a good connection to bees. My teacher actually had the idea of monitoring the bees at the school.\nMy ambition: Besides getting into the material myself and trying out different protocols, techniques and parts, I want to have a nice, elegant solution in the end.\nRoadmap: The project is divided into two parts, as I built two mostly completely different systems for monitoring.\nThis repository: A summary / loose documentation of the project to store experiences in.\nCredits: Many thanks to my kind teacher, not only for making this project possible in the first place, but also for answering all kinds of questions regarding beekeeping and for funding this project. Also thanks to the Schülerforschungszentrum Hamburg for supporting me in building and testing the weight scale and the power supply.\nMy school also wrote an article about my project:\njohanneum-hamburg.de Johanneum – Carl digitalisiert die Johanneums-Bienen The Gelehrtenschule des Johanneums is a state grammar school in Hamburg-Winterhude with a focus on classical languages and humanism. https://johanneum-hamburg.de/index.php/mint-faecher-nachrichten/1036-carl-digitalisiert-die-johanneums-bienen Honey Pi (first solution) # Figure 1: Finished solution version 1 Figure 2: Finished solution version 1, electronics Figure 3: HoneyPi system I built the first solution with the help of the honey-pi.de tutorial. It is a very nice website by kind people sharing their experiences monitoring their bees with a Raspberry Pi system. Next to the Pi, the main components are a solar panel, a car battery, a cellular surf stick for data upload and the weight scale, as well as an inside and outside temperature and humidity sensor. 
The solar panel charges the car battery, the car battery powers the Raspberry Pi, and the Raspberry Pi collects the sensor data and uploads it to the Thingspeak IoT webservice.\nBreadboard setup # Figure 4: Breadboard setup (screenshot from the high school diploma documentation) IoT box setup # Figure 5: IoT box setup (screenshot from the high school diploma documentation) Weight scale setup # Figure 6: Weight scale setup (screenshot from the high school diploma documentation) Online service # Figure 7: Thingspeak IoT software solution from Mathworks Figure 8: Additional own website utilizing the Thingspeak API for more beautiful graph display. (I never really used it) The system\u0026rsquo;s components # Small car battery, solar panel, charging module Raspberry Pi Zero (not Zero W) mini Linux computer, containing prebuilt Python agents for data upload Sensors: DHT22 sensor for temperature \u0026amp; humidity inside the beehive (custom case for bee protection) DHT22 sensor for temperature \u0026amp; humidity outside Bosche H30a weight cell + HX711 weight cell amplifier for continuously weighing the whole beehive Voltage divider for checking the battery voltage (around 12V) Upload via USB surf stick with SIM card over the mobile network Thingspeak webservice from Mathworks Stores data for one year for free Provides a nice REST API for communication between Raspberry Pi \u0026amp; Thingspeak Modular dashboard visualization of the data on graphs \u0026amp; widgets Own website that gathers the data from Thingspeak and displays it in a prettier way Result # This first solution took around one year to build. I submitted this to my high school exam and got 15/15 points. 
Yeah!\nAdvantages of the system:\ncompletely autonomous, if you ignore mobile radio connection strength and choose a free mobile phone contract (yes, there are free ones out there xD) beefy power supply \u0026amp; a lot of processing power Convenient webservice with easy configuration Disadvantages of the system:\nLinux \u0026amp; this much processing power on an IoT device is overkill (I do not need a lot of data analysis onboard). The IoT device could run on a lot less energy. I never really understood what those Python scripts actually do, and I didn\u0026rsquo;t trust them Shaky beehive position, as it was placed on one small weight cell Raspberry Pi always fully on (ca. 150mA at 5V (0.75W)), no sleep function Dependency on the ThingSpeak online service with its one-year data storage It\u0026rsquo;s not really my solution xD Especially in my implementation:\nBad connection to the sensors because of using breadboards as a permanent circuit Somehow the system didn\u0026rsquo;t work for very long, and I was annoyed and didn\u0026rsquo;t know what to do The DHT22 sensor inside the beehive broke, as formic acid from the bee treatment destroyed it. Also, even before that, it was always wet because of condensation water at the top of the hive. Other experiences building the project:\nDifficulties with the high temperature dependency of the weight readings\nThe beefy car battery still ran out\nHiveCom (second solution) # Figure 9: Logo! Figure 10: Setup Figure 11: The electronics box Figure 12: Temperature sensor for the beehive interior After my high school exam, the Honey-Pi solution stopped working properly within a short period of time, and I decided to start a completely new project. I don\u0026rsquo;t really know why exactly I spent so much time and energy on it, but I recently wanted to finally finish this project and get it working, so now it\u0026rsquo;s 2024. 
I spent a lot of my free time between the study semesters (5th / 6th) of computer science \u0026amp; engineering on this, and I am proud that it is finally done! It was installed at the hive on April 11th, 2024!\nIn this solution, I did everything by myself:\nIoT device at the beehive LoRa/Wi-Fi gateway device Self-designed LoRa \u0026ldquo;protocol\u0026rdquo; Self-setup Docker environment on a virtual server with: Influx database Grafana dashboard visualization Illustration of the self-designed HiveCom system # Figure 13: Illustration showing the whole system The system consists of four big parts:\nIoT node (at the beehive) # Figure 14: Electronics box Figure 15: Electronics box, top Powered by manually rechargeable LiPo batteries (2x 2500mAh at 3.7V in parallel) ESP32 as microcontroller for sensor data handling \u0026amp; upload Sensors DHT22 for outside temperature \u0026amp; humidity measurement DS18B20 (watertight) for inside temperature measurement 4x smaller Bosche H10a weight cells plus 4x HX711 weight cell amplifier chips (this time four weight cells instead of one, for better stability) ATtiny84 as a second microcontroller for HX711 management (the ESP32 does not have enough pins to connect all the HX711 chips, so I used an ATtiny as an IO middle-man that connects the HX711 chips to the ESP32 as an I2C slave) Periodic data upload over LoRa For the schematic, see the \u0026ldquo;IoT Node Schematic.pdf\u0026rdquo; file in this repository\nFeatures:\nMaybe half a year of battery life by using deep sleep mode (we will see) Debug mode for checking sensor values directly at the hive Changing the upload interval via the debug mode button (1min, 5min, 20min, 60min, 120min, 360min) Easy access to components \u0026amp; a see-through case for extra fanciness xD LoRa/Wi-Fi gateway # Figure 16: LoRa Gateway Small device in the school that receives the LoRa messages broadcasted from the IoT node. 
It is connected to the school Wi-Fi and uploads beehive \u0026amp; maintenance data to my Influx database.\nVideo 1: Moving splash screen with pixel art! Features:\nLeft-button function: See the most recently uploaded sensor values Middle-button function: View the upload log and possible errors Video 2: Small integrated log screen Right-button function: RICK ROLL video! (I didn\u0026rsquo;t know what to do with the third button that I\u0026rsquo;d soldered on before I had a plan) Video 3: Rick Roll! Self-designed LoRa \u0026ldquo;Protocol\u0026rdquo; # LoRa is a nice technology by Semtech that enables long-range, low-energy data transfer of small payloads. In the maker world, there are a lot of boards containing LoRa chips that are relatively easy to interface with in the Arduino IDE. I chose an ESP32 microcontroller board with a LoRa chip (and battery charging capability). The ESP32 is a powerful chip with versatile features, perfect for IoT devices. And it is also programmable over USB via the Arduino IDE (or PlatformIO).\nHowever, LoRa does not have built-in security, and you have to broadcast to everyone. On the gateway side, I only want to see the data from my IoT node. So I kind of designed my own security solution:\nOn the IoT node\nThe IoT node under the beehive collects all the data and then appends it value by value, comma-separated, in float format to a string. This is the payload. The node generates a hash based on the payload string with the password string appended. The password is stored in the flash memory of both the IoT node and the gateway. 
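A minimal Python sketch of this packet scheme. This is only an illustration under assumptions: the real code runs as Arduino C++ on the ESP32s, and the hash function (SHA-256 here), the ';' separator and all names are mine, not the original implementation.

```python
import hashlib

SECRET = 'hive-password'  # hypothetical; stored in flash on node and gateway

def build_packet(values):
    # Node side: comma-separated floats, then a hash over payload + password
    payload = ','.join(f'{v:.2f}' for v in values)
    digest = hashlib.sha256((payload + SECRET).encode()).hexdigest()
    return payload + ';' + digest

def authenticate(packet):
    # Gateway side: recompute the hash with the same password and compare
    try:
        payload, digest = packet.rsplit(';', 1)
    except ValueError:
        return None  # wrong format: treat as someone else's packet
    expected = hashlib.sha256((payload + SECRET).encode()).hexdigest()
    return payload if digest == expected else None
```

A packet whose hash does not match, or that has the wrong format, is rejected, which mirrors how the gateway filters out foreign packets.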
The node sends the payload with the hash appended over LoRa (using spreading factor 12, standard bandwidth) On the Gateway\nIf a LoRa packet arrives at the gateway:\nThe gateway stores the whole LoRa packet and separates payload and hash The gateway then tries to generate the same hash from the payload with the identical password from its flash If the hashes are equal, I treat the payload as authenticated. If it is not authenticated (or the format/syntax of the packet is wrong), I treat it as someone else\u0026rsquo;s packet I upload only authenticated values to the Influx database But I also store the non-authenticated messages separately in the database! (maybe an alien wants to text me? xD) Why not use LoRaWAN? I could also have chosen to just use the standard LoRaWAN protocol, but then I would also have needed to spend 100 euros or so on an open LoRaWAN gateway. I had two LoRa-capable devices lying around already, so I chose to use them instead. (But TheThingsNetwork is a very cool tool too!)\nWeb solution # Figure 17: The Grafana dashboard for displaying the measurements Webservice architecture:\n1-blu.de virtual Linux server (Ubuntu 20 LTS) with SSH Docker: Portainer for Docker management Nginx reverse proxy for subdomain management InfluxDB for storing hive data (and maybe triggering webhooks for notifications in the future) Grafana for displaying data from InfluxDB To finally get this project done, I had to do a lot of smaller projects that somehow converged into the final system. I didn\u0026rsquo;t plan these steps, but in the end it all somehow came together.\nSide projects that converged into this system # 1. Testing \u0026amp; playing around with the Lilygo ESP32 board # Figure 18: Lilygo pin assignment Hehe, I bought this one 2x from Aliexpress. There is next to no documentation for it! But the ESP32 is a pretty standard chip, and there is also a pre-made Arduino IDE profile for this board. 
I like it because it has a lot of interfaces:\nWi-Fi / Bluetooth (I just use Wi-Fi on the gateway side) SD card interface Small OLED screen (0.92\u0026quot; 128x64 pixels, fun little display!) Obviously LoRa A battery supply and charging circuit! And an on/off switch for the battery. Actually, I didn\u0026rsquo;t know whether my LiPo battery cells were compatible with the charging circuit, but I hoped, tried it out, and got lucky! Although the circuit gets pretty warm while charging xD 2. Thinking about how to deal with weight calibration \u0026amp; temperature dependency # The project stood still for a long time because I didn\u0026rsquo;t know how to get around this problem!\nIn the Honey Pi project I experienced a lot of temperature dependence, and I was not happy with the stability of the weight scale.\nSo for the new solution, I first bought cheap weight cells from Amazon (those cheap sensors you find in weight scales for humans). But somehow I didn\u0026rsquo;t know how to use them properly, or they are simply awful, and I had no luck. (See the failed experiment at the bottom of this page)\nThen, I bought very expensive weight cells (H10a from Bosche), around 160 euros in total for just the cells. I really wanted to get every bit of precision I could (even if my teacher said it should not be needed). By the way, the school\u0026rsquo;s beekeeping money paid for it, thanks!\nAnother problem was that I didn\u0026rsquo;t know how to calibrate. If you hook up a weight cell to an HX711 amplifier and then to the Arduino, you get raw values ranging from negative numbers into the millions. In theory, you just build the scale, put nothing on it and take this raw value as the zero value. Then, you put a known weight on it (higher weights are better for precision) and scale the value you get from the HX711 by the coefficient you calculate yourself. It should be pretty easy, just linear. 
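The two-point linear calibration just described (tare with nothing on the scale, then scale with a known weight) can be sketched like this; the raw readings below are made up, and in reality this would run per weight cell on the microcontroller:

```python
def make_calibration(raw_zero, raw_known, known_weight_g):
    # raw_zero: HX711 reading with nothing on the scale (tare)
    # raw_known: HX711 reading with a known reference weight on it
    scale = known_weight_g / (raw_known - raw_zero)

    def to_grams(raw):
        # Linear model: weight = (raw - tare offset) * coefficient
        return (raw - raw_zero) * scale

    return to_grams

# Hypothetical raw readings: -48200 for the empty scale,
# 164800 with a 10.4 kg reference weight on it
to_grams = make_calibration(-48200, 164800, 10400.0)
```

A raw value halfway between the two calibration readings then maps to roughly 5200 g, i.e. half the reference weight.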
There are a lot of tutorials for this on the internet.\nBut there are problems:\nI now have 4 weight cells, and the old wooden boards have bent over time outside. The weight is not distributed equally on each weight cell. What should I do about temperature dependence? How much do I have to care about it? Is it linear? Quadratic? Logarithmic? And I have no precise temperature sensor! Where the heck do I get a precise, heavy reference weight from to calibrate? So I spent a few days on a weekend at the Schülerforschungszentrum Hamburg, collecting data as best I could via ultra-precise chemical scales and multiple buckets of water. You find the data in the \u0026ldquo;Bosche Wägezellen H10A Analyse.xlsx\u0026rdquo; file. I wrote some small Python scripts to analyze the values that I got through the serial terminal.\nBy the way, for every value you want to get from the HX711, you read something like 5 values and take their average, as the reading floats a little (a big problem with the cheap weight cells)\nBut in the end, I found out that the expensive 4x H10a weight cells had a weight difference of only ±20g when I put them in the freezer or lightly blew on them with a hot air gun. This is a lot less temperature-dependent than I expected, and less than the first HoneyPi system that used the single H30a cell!\nAnd then I calibrated the scale with my monitor speakers, which should be 10.4 kg (the datasheet says that, I take it as true xD)\nFigure 19: Calibration setup 1 Figure 20: Calibration setup 2 Figure 21: Calibration setup 3 Figure 22: Calibration setup 3, second photo 3. Choosing temperature \u0026amp; humidity sensors # For the outside, I just reused a DHT22 from the old HoneyPi system (it seemed to work well). For the inside, I decided to take a watertight temperature sensor and drop the humidity measurement. I used the DS18B20, a pretty standard sensor in the maker world backed by good Arduino libraries.\n4. 
Buying more parts # Yes, you always forget something to buy :/. It is impossible to plan the whole project with all its costs completely in advance\n5. Building the weight scale # The basic principle of the weight cell is this: Figure 23: Basic principle of a bending weight cell You just measure the stretch / compression values.\nI was also worried about this scenario: Figure 24: Bending weight cell issues result in wrong weight measurements But I kind of solved the problem by having bigger holes in the top board and cutting the screw heads off:\nFigure 25: Bigger holes are the solution I had a discussion with some friends on how to get all the weight onto all four weight cells without any warping of the wooden boards and parasitic stabilities. Later, I took the Mechanics I module at university, and it actually helped me understand the problem in a better way lol\nOh, this was a little tricky, but I was really interested in getting my own I2C slave working. The ATtiny is also programmed with the Arduino bootloader to enable working with standard Arduino libraries. To ensure low current draw, I used an ESP32 GPIO pin as the power supply for the ATtiny and the HX711 chips (the datasheet says that you can drive around 20mA with a GPIO, and that\u0026rsquo;s enough). I had a lot of problems getting the I2C connection working, as I forgot to turn the ATtiny power supply on and wait for it to boot lol. But then, trying with and without pull-up resistors and checking the OLED display, which is also connected via I2C, it worked after something like 8 hours of developing / troubleshooting!\nDuring troubleshooting, I got a lot of weird display images, as the display is also connected over I2C. xD\nFigure 26: OLED display errors I decided to outsource the weight scale calibration calculations to the ESP32 and let the ATtiny just upload the raw values via I2C.\n6. 
Soldering the circuit \u0026amp; finding a small, somewhat \u0026ldquo;waterproof\u0026rdquo; case # Yeah, I am not proud of this soldering, but it works. It actually was the easiest part, I think. And I was lucky not to short anything.\nFigure 27: A lot of wiring 7. Getting into the school\u0026rsquo;s enterprise WPA2 Wi-Fi # As I am still in close contact with my school teacher, who is the IT administrator, he gave me Wi-Fi credentials in advance.\nI had a lot of respect for this problem, as there are few resources out there. I didn\u0026rsquo;t know whether I would need to switch the whole project to the ESP-IDF development environment because some hardware Wi-Fi driver capabilities might be missing.\nStandard operating systems include a lot of configuration options, but on a microcontroller you first even have to find them by looking into the Wi-Fi driver code!\nThe Arduino Wi-Fi library is actually pretty good, but the documentation is not great, and I did not gain more knowledge from reading the library source code.\nI ended up trying a lot of different configurations and putting a certificate into flash memory.\nI sat in the school\u0026rsquo;s library for like 6 hours, tried and tried and nearly despaired, but the patience was worth it! In the end, I got a local DHCP IP address and an internet connection, and it worked!\n8. Programming the UI on the IoT node and gateway # Yes, it took a while, but it was not that difficult. It was rather fun to add more and more convenience features and a \u0026ldquo;booting\u0026rdquo; animation!\nFigure 28: Boot screen (the dots actually move, yeah) 9. Designing a case for the gateway (3D-printed, Fusion 360) # I was lucky, because I had bitten the bullet and bought a caliper in advance!\nFigure 29: Measuring the dimensions of the gateway pcb Figure 30: Designing a case with Autodesk Fusion 360 to 3D-print Figure 31: 3D-printing the case 10. 
Getting the rick roll video onto the gateway device # Hehe, it was fun to download the video from YouTube as MP4, cut it to about 17 seconds, convert it to a lot of BMP images and scale them down to black / white 128x64 pixels.\nThere is a nice website out there that converts those pictures into Arduino C++ code format. And then you just have to play around with pointers to step through the pictures, and with the delay in between, to get the right speed.\nMaybe I will upload a tutorial on how to do this sometime, as there is no tutorial out there just yet\nFigure 32: Getting the Rick Roll prank video up and running 11. Approximating the battery status from the battery voltage # I used a dual 100kOhm voltage divider for this. Yes, there is some extra current leakage through this, but I think it is ok.\nWith the help of the graph on Adafruit\u0026rsquo;s website (https://learn.adafruit.com/li-ion-and-lipoly-batteries/voltages) I approximated the charge level with two linear functions. I hope that works, as I didn\u0026rsquo;t try it out!\n12. Playing around with the Arduino InfluxDB library # Oh yeah, it was nice to see the information flow path growing at this point\n13. Configuring the Linux server Docker environment \u0026amp; configuring Nginx to access the services from the internet # A kind friend of mine helped me get into this very quickly; he helped me set up my server and get basic security in place along the way.\n14. Playing around with InfluxDB \u0026amp; Grafana # Figure 33: InfluxDB online configuration InfluxDB and Grafana are very nice tools! And it is incredible that they are free! Also, it is fun to link the data together, and there is easy satisfaction when combining API keys xD\n15. Raspberry Pi home weather station test # To test the InfluxDB / Grafana software, I hooked up a Raspberry Pi at home with a SenseHat I had lying around (a very nice extension board, by the way). I put some small Python scripts into systemd, I think, so it just uploads some sensor data like temperature and humidity from its position under my bed. 
It actually worked, and I forgot about it, but then at some point the humidity sensor stopped working. I think humidity sensors are kind of fragile.\nHere I learned how to set up nice Grafana queries and transformations to get nice, colorful graphs! I think I need to do more with this and maybe try out predicting data based on machine learning?\n16. Configuring API keys \u0026amp; the database for the HiveCom data upload # Pretty easy at this point\n17. Setting up the Grafana dashboard # It is so satisfying to configure the widgets and graphs and make the data pretty once the whole information flow is complete!\nWith all kinds of fancy data transformations and filters, you can serve the data on a silver platter!\n18. Test run for a few days \u0026amp; continuous analysis # Pretty boring, it just worked lol. But I was worried that I had approximated the battery level wrongly; we will see.\n19. Testing LoRa messages with a wrong signature # I obviously needed to try sending such messages to myself, to be sure I can record someone else\u0026rsquo;s messages after the final deployment!\n20. AND THEN FINALLY INSTALLING IT AT THE HIVE! # April 11th of 2024!\nI am very happy that the system is finally in use, but I am not allowing myself to draw conclusions from the data too early.\nTo Do # configure automatic WhatsApp notification on hive alarms First analysis of the data # Coming soon. We will see!\nFailed experiments # Using cheap weight cells # Figure 34: Cheap weight cell measurement setup I tried using cheap weight cells (ca. 13 euros in total) from Amazon, but the values differed by ~130g from the real weight. Also, the values differed a lot between consecutive measurements, and I didn\u0026rsquo;t get a clear outcome. 
I wanted more precise values.\nDesigning a custom DHT22 sensor housing for the inside of the hive # Figure 35: Alternative DHT22 casings Figure 36: Switching the DHT22 case I\u0026rsquo;ve heard that bees close holes of specific sizes in the beehive to prevent other insects from getting inside the hive. For this, they use their self-produced propolis, which is also antibacterial and protects against diseases. Research on the internet did not reveal a specific hole size that gets closed; however, it should be between 1-6 mm. I designed different cases for the DHT22 to test that out.\nHowever, it didn\u0026rsquo;t matter, because the humidity sensor was always wet and broke anyway ¯\\_(ツ)_/¯\nGetting a good temperature-weight calibration # Figure 37: Embarrassing experiment to calibrate the weight cell with temperature Probably a little embarrassing, as I tried to measure the temperature dependency with a hair dryer in a verrry DIY way. I tried to put one DHT22 directly next to the weight scale between the wooden boards, but it worked as well as it looked xD. I put books around the weight scale to prevent the warm air from escaping the scale too quickly.\nHere is some data I collected in this experiment:\nFigure 38: Sketchy data However, sometimes the upload process failed, and I didn\u0026rsquo;t know what exactly to do with this data. Having a low-temperature oven or a calibrated temperature sensor would help xD\nI am interested in your thoughts! 
- Reply with a simple Email Have a nice day,\nCarl\n","date":"14 April 2024","externalUrl":null,"permalink":"/articles/hivecom/","section":"Articles","summary":"","title":"Remote beehive monitoring from scratch","type":"articles"},{"content":"","externalUrl":null,"permalink":"/authors/","section":"Authors","summary":"","title":"Authors","type":"authors"},{"content":"","externalUrl":null,"permalink":"/categories/","section":"Categories","summary":"","title":"Categories","type":"categories"},{"content":"","externalUrl":null,"permalink":"/series/","section":"Series","summary":"","title":"Series","type":"series"}]