I recently had a tech support query regarding the Windows 10 Mail application – the same may apply to Windows 11 as well. The user’s ISP was making a change to their mail service, which required the user to change the SMTP/IMAP/POP configuration their mail client uses. This sounds like the kind of change that should be fairly straightforward, but Windows Mail makes it very frustrating, due to what I assume must be an edge case missed in automated testing. Any decent QA engineer would find this almost instantly… which implies… anyway…

The user had tried to make the change to their configuration by going into Settings / Email & app accounts, choosing their email account, and clicking Manage. All good so far.

At the next popup, you’ll see the account name, with a section ‘Change mailbox sync settings’, under which is a link, ‘Options for syncing your content’. It’s this link you’d click to make changes.

So you do this, make the changes – in this case, changing the IMAP/POP port, and enabling SSL – click Done, then Save.

The Mail application then tries to sync, and this is where the problem lies. If you’ve got the settings wrong, bad luck – the application gets stuck in a bad state: it can’t sync, it shows the sync as perpetually in progress, and the ‘Options for syncing your content’ link is greyed out, i.e. disabled.

You’ve inadvertently broken your config, and now cannot fix it.

I spent quite some time trying to get back into the configuration that the user had mistakenly changed, but was frustrated at every attempt. I restarted the application: it immediately tried to sync, and the link was disabled. I restarted Windows: same situation, disabled. I googled, but the suggestions I found were unscientific, to say the least.

Upon clicking the ‘Sync Error’ icon, I was given the option to re-enter the user’s mail password. This was no help, since the user couldn’t remember it. Exiting that dialog, I could try to Manage the account again, but the link I needed was still disabled.

Thinking of possible heuristics and strategies for getting tech working again, I considered how this dialog might have been written (this is pure speculation): when displayed, if the sync status is ‘in progress’, disable the link, since it would be hard to manage changing configuration state while that state is in use. So, how to prevent the sync status from being initialised, given that it goes ‘in progress’ immediately on startup?
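If that guess is right, the dialog’s logic might look something like this – sketched in pseudocode, and I stress again this is pure speculation, with no sight of the actual source:

```
on dialog shown:
    if account.syncStatus == IN_PROGRESS:
        disable 'Options for syncing your content' link
    else:
        enable link

on application startup:
    if network is available:
        begin sync                    # syncStatus -> IN_PROGRESS immediately
    else:
        syncStatus = UNINITIALISED    # sync never starts; link stays enabled
```

Under this model, a sync that wedges in IN_PROGRESS leaves the link disabled forever – which would explain exactly the behaviour seen.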

Disconnect the computer from the network, reboot, and try again.

Bingo! In the absence of a network connection, Mail doesn’t try to sync on startup, so the status is ‘uninitialised’ – and the link is enabled. I clicked it, corrected the configuration, re-connected to the network, rebooted, and it’s now working.

I also installed and set up Thunderbird, for a quality solution.


This blog has been a little quiet of late, but that’s because I’ve been busy experimenting. I prefer to work on experiments in the background until they’re sufficiently developed and might actually be workable, before writing about them here. This project hasn’t really reached that bar yet, but I think it’s worth sharing.

I had the first ideas for this in September 2020, while working on Parachute. It’s fair to say I put Parachute to one side while getting this started, and haven’t picked it up again properly yet. I used to work on Parachute mostly during my daily commute, and since the COVID-19 lockdown I no longer had that as part of my day. My daily routine was completely restructured, and so the opportunities to work on Parachute changed.

I’d started developing an application at work using the Rust programming language, which I hadn’t used before. The benefits of Rust for determinism, low-level access to hardware, memory safety and concurrency are very strong indeed. Had I not been exposed to it, I would have started developing this in C++, as I had for Parachute. Using Rust for digimorse was definitely the right choice. I’d still be wrangling boost and CMake if I’d chosen C++, and wondering where low-level software development all went wrong. Rust is a game changer. I like golang as well: it could have been a contender, but Rust is far safer.

Digital communications techniques + Morse code = digimorse

So, what is this? From the project’s README.md:

Morse code was the first means by which messages were sent by Marconi, as he performed his experiments in the 1890s. This was achieved using a crude spark gap transmitter. In 1913, Edwin Armstrong and Alexander Meissner used thermionic valves to build a continuous wave oscillator, which refined the transmission mechanism. This form of transmission, known by the abbreviation CW, was widely used for military, maritime and commercial telegraphy, gradually being replaced by more modern systems of communication. Its last maritime use was in 1999.

It is still, however, used widely by amateur radio operators, such as myself. Although the electronic circuits used have been refined as far as they can be, the essential continuous wave transmission mechanism has remained unchanged.

This project is an attempt to modernise that.

I’m some way off a release, since only about a third of the architecture is present; there’s no graphical user interface – it’s not usable yet. However, I hope the basic idea might work.

More details of what I’m attempting follow after that paragraph in the README.md… please read on!

For more information, the GitHub repository is at https://github.com/devzendo/digimorse and the documentation describing the system in more detail (as it exists so far) is The digimorse Communications Protocol [PDF].

Would you like to collaborate?

If you’re interested in helping develop the ideas or the code behind this, please get in touch! As always, all comments are welcome – even if you think the idea won’t work: after forming a hypothesis, one should always seek to falsify it, so as not to waste effort trying to prove it.

I was trying to build cargo-instruments, to analyse a Rust program for various leaks, but was plagued by an odd problem:

Undefined symbols for architecture x86_64:
  “_libiconv”, referenced from:
      _git_path_iconv in libgit2.a blah blah blah

I traced it down to a conflict between the Rust git2 crate and libiconv. It seems the git2 build has a hard-coded search path for iconv, expecting to find it in /opt/local/lib – which is where MacPorts had installed its version of the iconv library.

By temporarily renaming /opt/local/lib/*iconv* (libiconv.2.dylib, libiconv.a and the libiconv.dylib symlink), I was able to build successfully.
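In concrete terms, the workaround was along these lines – a sketch rather than an exact transcript of what I typed; LIBDIR defaults to the MacPorts library directory (you may need sudo, and can point LIBDIR elsewhere to rehearse safely):

```shell
# Move MacPorts' iconv libraries aside so the linker falls back to the system
# libiconv, run the failing build, then put everything back.
LIBDIR="${LIBDIR:-/opt/local/lib}"

disable_iconv() {
  for f in libiconv.2.dylib libiconv.a libiconv.dylib; do
    if [ -e "$LIBDIR/$f" ] || [ -L "$LIBDIR/$f" ]; then
      mv "$LIBDIR/$f" "$LIBDIR/$f.disabled"
    fi
  done
}

restore_iconv() {
  for f in libiconv.2.dylib libiconv.a libiconv.dylib; do
    if [ -e "$LIBDIR/$f.disabled" ] || [ -L "$LIBDIR/$f.disabled" ]; then
      mv "$LIBDIR/$f.disabled" "$LIBDIR/$f"
    fi
  done
}

disable_iconv
# ... run the failing build here, e.g. cargo install cargo-instruments ...
restore_iconv
```

The functions no-op harmlessly if the files aren’t present, so it’s safe to run twice.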

I’m moving house again. Not a physical house, crikey no – that was the most harrowing experience of my life last time. No, code hosting.

I started hosting my code on Google Code, which was a nice hosting service, but like many Google products it was closed down, as it couldn’t offer the features that other hosting providers such as GitHub and BitBucket did. Read about it here.

So I moved to BitBucket, mostly using Mercurial, which I still maintain is a superior and easier-to-use system than Git. And all was good, until Atlassian, the owners of BitBucket, decided to stop hosting Mercurial repositories. Not only were they going to stop users creating new ones, they were going to delete all of them. From an archivist’s perspective, this is abhorrent behaviour. Considering that Open Source is what most technology businesses are built on, they’ve just raised a large digitus impudicus to the Open Source world. And we’ve taken note. Read the announcement here.

But the forum posts are important too: developers up in arms about the vandalism Atlassian are perpetrating, with no mechanism provided to auto-convert their Mercurial repositories to Git, hosted at BitBucket. The community has responded splendidly, providing tools and instructions on how to perform an exodus. I have contributed one or two simple scripts to aid with it, but mostly I’m using Magnus Hovland Hoff’s superb Kick the Bitbucket script.

I have over forty repositories to migrate from Mercurial to Git, and from BitBucket to GitHub. I have to convert them, update my source control tools, IDEs, build servers and references between projects on my web site, and also change all the URLs of images I place on this blog. Not a small task, and it has taken me off my main project, Parachute.

Time will tell whether Atlassian sunsets BitBucket as a whole. The overwhelming view in the forum is that they have sown mistrust in all their products, with users abandoning them. I hope they keep Trello. JIRA? Hmmm…

Is GitHub a safe home? I hope so. The change of ownership to Microsoft has been good for it, I feel. Has Git improved since I started using Mercurial? A little, but it’s still a pretty grim user experience. Mitigated with Syntevo SmartGit. The command line is still a Lovecraftian horror.

While updating pages of this blog, I ran some stats. The most viewed pages are those that relate how I solved a technical problem; some irksome thing that had taken me a while to sort out, so I wrote up what I’d done. It’s nice to know that I’ve helped others in some way; tech is hard. Next come the more complete radio/antenna constructional articles. Near the bottom however, are posts on Parachute.

Perhaps I just need to do smaller things that are more easily completable?

I’ve just had a short break away from the constant pace of previous work – 16 months’ worth of near-daily updates – in which I caught up on the stack of magazines, articles and books I’d been meaning to get through. So, away from tech for a while – hence no blog post last month.

The book stack never goes down – see Michael Simmons’ article at https://medium.com/accelerated-intelligence/the-5-hour-rule-if-youre-not-spending-5-hours-per-week-learning-you-re-being-irresponsible-791c3f18f5e6 – if I could take a yearly two-week reading vacation, I would… but back in the real world…

Now I’m ready to start on Parachute 0.0.2, the rework of the node server protocol to be iserver compatible. This should mean that the emulator could work with other emulators’ iservers, and vice-versa. However, the link emulation mechanism would need additional variants, to use the mechanism used by other emulators – e.g. http://lcm-proj.github.io/ as used by Gavin Crate’s emulator (see https://sites.google.com/site/transputeremulator/Home/multiprocessor-jserver-support).

During this work, I’ll update the Hello World assembly program, and start upgrading the C++ code to C++11 as needed.

16 months after picking up this old project, I’m happy to release the first cross-platform version of my T800ish Transputer Emulator!

See the Parachute 0.0.1 release notice for more details, download links etc.

As the previous blog post here described, there have been several frustrating aspects to the development – and these have only continued since then. However, I now have a good base, with mostly fully automated build/test and release systems – so subsequent releases should be easier.

One aspect of the build has been to use Maven & Jenkins for everything – despite several components of Parachute not being pure Java. Between these two, and a moderate collection of plugins, I have a multi-platform C++ build, with deployment to Maven Central.

I have a long-standing dislike of language ecosystems building their own component build/dependency/deployment systems. They re-invent Maven – sometimes poorly – usually because of a mistaken idea that it is Java-only. I’d much rather build on the existing base of Maven than treat my language experiments as in some way “special”, requiring me to re-invent it unnecessarily. Maven is not perfect, but I think it’s a good fit for Parachute. I’ll still be providing build tools as command line tools and libraries, for easy use in non-Maven systems.

In the next phase of Parachute I’ll be converting the protocol between the emulator and host I/O server to be compatible with the Inmos iServer, then getting eForth working.

But first, a rest. Happy Summer Solstice!

TL;DR: Frustration, but the end is in sight.

Parachute is composed of several separate projects, with independent versions, held in separate repositories:

  • the Transputer Emulator itself, written in C++, built using Maven/CMake/Make, which requires building and packaging on macOS, CentOS 7, Ubuntu 16.04 and 18.04, Raspbian Stretch, and Windows 10.
  • the Transputer Macro assembler, written in Scala, built using Maven, which requires building and packaging on macOS, Linux (one cross-platform build for all the above Linux variants), and Windows 10.
  • and eventually there will be the eForth build for Transputer, other languages, documentation, etc.

Getting all this to build has been quite the journey!

I use Maven as the overall build tool, since it gives me sane version management, build capability for all the languages I use (via plugins), packaging, signing, and deployment to a central repository (I’m serving all build artefacts via Maven Central).

Each project’s build runs across a set of Jenkins instances, with the master on macOS, and nodes on virtual machines, and a physical Raspberry Pi.

Each project deploys a single artefact per target OS, into Maven Central’s staging repository. So there are six build jobs, one on each node, that can sign and deploy on request.

The effect of this is that a single commit can trigger six build jobs for the C++ code, and three for the JVM-based code (since all Linux systems package the same scripts). Deployment is manually chosen at convenient points, with manual closing of the staging repository in Sonatype’s OSSRH service.

The manual deployment choices may be removed once all this is running smoothly. Since I cannot produce all platform-specific artefacts from a single Maven build, I cannot use the Maven Release Plugin.

Once the emulator and assembler are deployed for all their variants, there is a final build job that composes the Parachute distribution archives, signs them and deploys them to Maven Central via Sonatype OSSRH.

There have been several ‘gotchas’ along the way…

… the GPG signing plugin does not like being run on Jenkins nodes. It gets its config from the master (notably the GPG home, from which it builds its paths to the various key files). So that had to be parameterised per node.

… getting the latest build environments for C++ on each of the nodes. I’m not using a single version of a single compiler everywhere: there’s a variety of clangs (from 3.5.0 to 8.0.0), plus the Microsoft Visual C++ Build Tools.

… Windows. It’s just a world of pain. Everything has to be different.

So this long ‘phase one’ is almost at an end, and I hope to ship the first build very soon.

It would be ‘fun’ to see if I can replicate all the above with a cloud-based build system instead of Jenkins + VMs. However, Windows, macOS and Raspberry Pi will be problematic. Travis CI does not have CentOS or Raspberry Pi hosts; Circle CI does not have Windows, CentOS or Raspberry Pi hosts (Windows is on their roadmap).

Surely, there’s something to report this month?

Well, Parachute is making progress: I now have a version that runs on Windows 10, with some small niggles. It’s building on macOS, Windows 10, CentOS 7 and Ubuntu 16.04. I’m now working on the final packaging builds that’ll bring together the emulator, assembler, example program (the essential ‘Hello World’ in assembler) and eForth into a single distributable package for each platform.

I’m also starting to read up on theory relating to the Transputer, occam, and the process algebra it’s based on, Communicating Sequential Processes. More on this (a series of rough notes on the papers and books I’m reading) later.

The magnetic loop is still waiting for me to work out how to do brazing, to connect the copper pipe loop to the variable capacitor in a manner that keeps resistance to a minimum.

The digital modes transceiver keeps knocking around in my head; meanwhile, I did find an archived copy of the Elektor issue that describes the LUFA library: http://d1.ourdev.cn/bbs_upload782111/files_43/ourdev_663216L8BOSE.pdf

In terms of goals, my CW goal is utterly shot. I’m still keeping up with the Bulgarian Station Blagovestnik’s special event stations – I’ve managed to contact all of them so far, including this month’s via some extremely shaky CW (I must pick up CW practice again.)

I have various other fitness/study goals (abstract algebra, category theory, stoicism) and commitments that are noodling along slowly… but finding time is always the difficult bit.

One thing I am sticking at solidly is meditation: I’m making time for it every day (with a couple of unavoidable exceptions). As Voltaire said, “The more you read without thinking, the more you think you know a lot; but the more you meditate, the more you see that you know very little.”

I know so very, very little.

If you need to visualise molecular structures, RasMol is a venerable program that runs on multiple platforms. I was recently asked to help get it working on modern macOS. There are instructions on the RasMol website, but not for recent versions of macOS. Hence these rough notes – offered in the hope they may help others…

RasMol uses the older X11 windowing system, which is no longer provided as part of macOS, so we’ll install the open-source XQuartz X11 system from XQuartz.org.
Download the .dmg (disk image file), open it, and run the installer. You’ll then need to log out and log back in.

Then download RasMol from its SourceForge download area. You’ll need the file RasMol_2_7_5_3_i86_64_OSX_High_Sierra_19Dec18.tar.gz.

Once this has downloaded (into your Downloads folder), you’ll need to open a Terminal and extract its contents with the following commands. Note that MyMacBook$ is the prompt provided by the Terminal/OS, and that each command must be entered on one line – there’s a single ‘tar’ command that starts with ‘tar’ and ends with ‘.gz’ before you press enter:

MyMacBook$ cd Downloads
MyMacBook$ tar xzvf RasMol_2_7_5_3_i86_64_OSX_High_Sierra_19Dec18.tar.gz
(Many lines will scroll by)

We’ll put this extracted software somewhere a bit easier to get to:

MyMacBook$ mv RasMol_2_7_5_3_i86_64_OSX_High_Sierra_19Dec18 ~/Applications/RasMol_2_7_5_3

(Now it’s in your personal Applications folder).

We need a launch script, since the one that comes with the software doesn’t seem to work – it can’t find XQuartz’s libraries. So, from the Terminal:

MyMacBook$ cd ~/Applications/RasMol_2_7_5_3
MyMacBook$ nano run-rasmol.sh

This puts you into the ‘nano’ text editor, then you must copy and paste (there are three lines here):


Then press Control-X, and press Y then return to save the file. Then to make this launch script executable:

MyMacBook$ chmod 755 run-rasmol.sh

OK, nearly there.

Only kidding 🙂

Let’s create an alias to let you run RasMol a little more easily…

MyMacBook$ cd (then press return)
MyMacBook$ nano .bashrc

Again, you’re in the nano editor, so copy and paste this – note that your file may already contain lines, just add the ‘alias’ line near the end:

alias rasmol='cd ~/Applications/RasMol_2_7_5_3; ./run-rasmol.sh'

Then Control-X, and Y then return to save the file.

Righty, let’s run XQuartz (via Spotlight [Command-Space]). After a few seconds, you’ll see an X11 terminal window (xterm) appear. This is different from the usual Mac Terminal. You won’t be able to run RasMol from the Mac Terminal: you must run it within the XQuartz system, and you do this from the xterm window. Note that the OS prompt in the xterm window will be different – in my case it is bash-3.2$ – so in the xterm, type:

bash-3.2$ rasmol (then press return)

Then you’ll see the beautiful RasMol window. It’s very different from what you’re used to on macOS, but this is how we used to use graphical programs back in the 80s.

The main RasMol window has its own menu – it’s not in the top menu bar like ‘normal’ Mac programs.

When you close XQuartz, you close ALL the X11 programs you’re running – xterm, rasmol, etc.

To open a file in RasMol, use the File menu, then Open…, then use the old-style file dialog to navigate using the ‘..’ (Parent Folder) directories to find where you’ve stored your RasMol files.


Updated 11/09/2020 with clarifications after comments by Vivian Vu, thank you!

Back in July 2017 I wrote a post on here in which I gave a rough sketch of a combination transceiver/computer that would allow me to take a single unit, antenna kit and power, and work digital modes portably, with a minimum amount of baggage. Like one might do with the Mountain Topper range of CW transceivers, but capable of operating with digital modes.

When I wrote the article, I was into JT65 and JT9. Now of course, FT8 is the mode du jour.

The DDS of choice was the ‘el cheapo’ AD9850/AD9851 boards that are available on eBay: now I’d go for the Si5351 DDS boards, with a module available from Adafruit, and also available in an ‘el cheapo’ variant! This DDS creates fewer harmonics.

I’m still very much a beginner at RF design: that is still the major risk to the project, as is the absence of copious amounts of spare time in which to work on it!

However, one risk I’d identified – making an Arduino present itself as a sound card plus multiple serial devices – seems to be reducible. LUFA (Lightweight USB Framework for AVRs) is “an open-source complete USB stack for the USB-enabled Atmel AVR8 and (some of the) AVR32 microcontroller series, released under the permissive MIT License”. It has examples of Audio In/Out and Serial devices. I’m hoping it can also provide a composite device that allows the single audio I/O channel, and two serial ports (diagnostic and CAT control).

So the next action on this project is to make an Arduino Micro look like a sound card with two serial ports. It’ll be a loopback device, so whatever sound you play at it (i.e. when transmitting) will be played back to you when receiving; whatever you send on serial port A will be echoed back to you with a ‘DIAG’ prepended to it; similarly with port B as the ‘CAT’ port.
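To pin down the intended semantics, here’s a toy model of that loopback behaviour in shell – purely illustrative, since the real thing will be AVR firmware; the sample input is a Kenwood-style CAT set-frequency message, chosen just as example data:

```shell
# Each function models one of the planned USB endpoints: audio is returned
# untouched, and each serial port echoes input with its role prepended.
audio_loopback() { cat; }
diag_port() { while read -r line; do echo "DIAG $line"; done; }
cat_port()  { while read -r line; do echo "CAT $line"; done; }

echo "FA00007074000;" | cat_port    # prints: CAT FA00007074000;
```

That’s the whole contract the firmware has to honour, which should make it easy to test from the host side.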

Still unknown: an SSB transceiver design that’s buildable by a beginner, and that can be connected to an ADC/DAC pair. How many bits of audio do I need to sample, and at what rate?
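A rough back-of-envelope answer, using the Nyquist criterion and the ~6 dB-per-bit rule of thumb – these figures are my working assumptions, not a settled design:

```shell
# Digital modes like FT8 sit in audio up to about 3 kHz, so Nyquist requires
# sampling above 6 kHz; 8 kHz is a comfortable standard rate with headroom.
audio_bandwidth_hz=3000
sample_rate_hz=8000                 # > 2 x 3000 Hz, with margin
# Dynamic range is roughly 6 dB per bit; 12 bits gives ~72 dB, which seems
# ample for audio destined for an SSB transmitter.
bits=12
dynamic_range_db=$((bits * 6))
echo "sample at ${sample_rate_hz} Hz with ${bits} bits (~${dynamic_range_db} dB dynamic range)"
```

So a modest 8 kHz / 12-bit ADC–DAC pair looks sufficient on paper – subject to finding a microcontroller that can keep up.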

This may well require a microcontroller that’s a bit more powerful than my usual Arduino Micro – possibly one from the Teensy range.

… to be continued…