AI art

An example of AI art - a generated image of a robot painting a picture of some flowers on an easel

You may have noticed that several of my recent blog posts have featured custom AI artwork related to the topic. I’ve generated these using Microsoft’s Bing AI Image Creator, which uses OpenAI’s DALL-E text-to-image model. DALL-E can generate an image based on a text prompt; for example, the prompt for the featured image on this post was ‘A 1950s style robot standing in front of an easel painting a bouquet of flowers in a vase’.

These are some of the other AI art images that I’ve used recently:

AI art is controversial. It can create images in a few seconds that would take a human artist hours or days to produce. And, in some cases, these models can be prompted to create images in the style of a particular artist, depriving that artist of income from a commission. It’s also notable that models like DALL-E and Stable Diffusion have been trained on copyrighted works, without the rights holders’ permission.

With this in mind, I’m justifying my use of AI art on some of my blog posts because I’m not an artist myself, and as an individual blogger who doesn’t make money from blogging, I wouldn’t have the money to pay a human artist. Whilst I have over 5000 photos that I’ve uploaded to Flickr, there isn’t always a relevant photo to use that I have taken. For example, in my recent post on comment spam, I decided to generate the above image of a robot converting blog posts into a tinned meat product, because I don’t have a photo that represents that. And whilst I make use of screenshots where relevant, sometimes this isn’t appropriate.

Of the AI art generators that I have used, the Bing AI Image Creator seems to be the one that gives me the best results. Any images you create are saved in the cloud, and can be downloaded for re-use. And each prompt produces four images so that you can choose the one which looks the best.

More new old posts from the archives

An AI generated image of a phoenix rising from the flames of a browser window

I’m gradually bringing back some of my old blog posts that were lost, and here are links to the latest batch that I’ve made live again:

  • Create a Safely Remove Hardware shortcut (April 2007). Another of my how-to blog posts, this allowed you to create a desktop icon that opens the Safely Remove Hardware function in Windows. Surprisingly, this still works in Windows 10.
  • Screenshots on a PocketPC (November 2005). How to take screenshots on a PocketPC/Windows Mobile device, since there wasn’t a built-in screenshot app.
  • Resurrecting a dead OS with KernelEx (May 2010). KernelEx is a compatibility layer for Windows 98 and Me that allowed apps which would normally require Windows XP to be installed. You can still download it, but it was last updated in 2011 and I expect there’s not much demand for it nowadays.
  • My favourite add-ons for Thunderbird (May 2014). Some add-ons that I used with Mozilla’s Thunderbird email client. I don’t use it anymore – we use Outlook at work and at home I tend to just use Gmail.
  • How to: get cheaper car insurance (September 2015). Another how-to guide. When reposting this, it was surprising how few sites back in 2015 used HTTPS by default; Let’s Encrypt had launched the previous year so I’m guessing Google hadn’t yet started favouring HTTPS sites in its search results.
  • A car. An actual car (September 2015). Linked with the above, the purchase of our first car. Sadly said car got written off in France in 2019, although it subsequently got back on the road with a new owner.
  • Passed (August 2015). Again, linked with the above, this was about me passing my practical driving test.
  • Expecting (July 2015). The announcement that my wife Christine was pregnant. Linked from the above posts; I’m also looking to reinstate blog posts about major life milestones.
  • 20 week scan (September 2015). As above.
  • Back in the driving seat… (July 2014). Restarting driving lessons, which led to me passing my test, as above.
  • Theoretically passed (April 2015). Passing my driving theory test. A lot of the above posts link together, and so I wanted to bring them all back at once to avoid creating dead links.

There are, of course, more to come. Whilst I estimate that I’ll only be bringing back around 10% of the old blog posts, that does mean around 250 posts to copy from the Web Archive and update.

Why you shouldn’t buy gift cards as presents

An AI generated image of a Christmas tree with lots of presents and gift cards underneath it by a window.

When you need to buy a present for someone, and aren’t sure what to get them, gift cards seem like a good idea. With Christmas coming up, I’m going to explain why they’re not always the best idea.

They’re less flexible than cash

If you spend £10 to buy a £10 gift card, all you have done is taken £10 of cash, which can be spent anywhere, and converted it into a sort of pseudo-currency that can only be used at one shop. You can’t use a gift voucher for John Lewis at M&S for example.

Whilst multi-retailer gift cards like Love2Shop and One4All exist, they still limit you to a small range of retailers. And you can usually only spend them at large chain stores, so your recipient won’t be able to spend them at local, independent shops. Let’s face it, Amazon is likely to be around for a long time, but independent shops would probably appreciate your custom.

They can only be used to purchase things

This might seem obvious, but you can only use gift cards to buy more things. You can’t use gift cards to pay bills, or repay debt, for example. And I mean, you really can’t – if someone claims to be HM Revenue & Customs and asks you to pay your tax bill with iTunes Gift Cards, then it’s a scam.

For someone who may be drowning in credit card debt, receiving some money that they can use to pay that off may be more meaningful. At worst, you could end up spending your money on a gift card that can only be used to buy something at a shop where the cost of getting there is higher than the value of the card.

They could also be worthless. If you’re an Android phone user, then you’re not going to get much out of an iTunes gift card, for example. You could try a web site that exchanges gift cards, where you can sell an unwanted gift card for cash. However, you’ll probably get less than its value back, and obscure gift cards may not sell for much.

They expire

Most gift cards expire after 12 months. We’ve had this problem before; a relative bought our (now) seven-year-old a gift card for a well-known toy shop chain. As their birthday is close to Christmas, we saved it to buy a gift the following year, but by the time we came to use it, it had expired. Meanwhile, cash never expires.

If the retailer goes bust, they may become worthless

We’ve recently seen the demise of Wilko in the UK, and other large chain stores like Debenhams, Jessops, Comet, Woolworths and Burton have all disappeared in recent years. Usually, when these companies go bankrupt and call in administrators, their gift cards immediately become worthless. At best, you can register as a creditor of the company in the hope that you may get a fraction of the value of the gift card back.

Some people have lost serious money because of this in the past. Debenhams used to offer a wedding list service, and so those that had people buy them Debenhams gift cards as wedding presents may have lost out on hundreds of pounds.

What to do instead

Buying presents can be tricky, and I don’t think anyone wants to buy something that’ll just end up listed on eBay on Boxing Day. But maybe have a conversation with the person you’re buying a gift for first. Surprises can be nice, but so can knowing that you’re getting something you actually want for Christmas. Christmas lists for Santa needn’t just be for children; you could keep a list in a note-taking app, for example, so that if anyone asks you what you want, you can tell them straight away.

Or you could just give people cash. If all you are doing is swapping the same amount of money for a card which is restricted to one retailer and expires, then you’re taking choice away from your recipient. With cash, your recipient could:

  • do the weekly food shop
  • pay off a credit card
  • buy something nice from a small independent shop.

An Amazon gift card won’t allow the recipient to do any of those things.

If you don’t want to put bank notes or coins in the post, you can send a cheque. Despite rumblings from the banking industry a few years ago, most banks will still let you send and receive cheques. Indeed, most banking apps will let you scan cheques, so you can scan them on Christmas Day without waiting for a branch to open. Alternatively, you could send an IOU in a card, and then do a BACS transfer on Christmas Day. That’s if you already know their bank details, of course.

When is it appropriate to send gift cards?

So, now that I have written this, you may be surprised to hear that I am planning to send gift cards to some relatives this Christmas. But this is only because said relatives have specifically asked for them. And that’s fine – you could ask for gift cards as a contribution to a big purchase, for example. Just be careful that you choose a retailer that isn’t at imminent risk of bankruptcy. Money Saving Expert News is usually a good place to get news about retailers that are in, or at risk of entering, administration, and their policy on accepting gift cards.

You can also sometimes buy gift cards at a discount. My employer offers Pluxee as an employee benefit, which sells gift cards at a typical 4% discount – but sometimes more. M&S is 6.5%, which means that you can buy a £25 gift card for £23.38.
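The arithmetic is simple enough to sketch in a few lines of Python (discounted_price is just an illustrative helper of my own, not any real API):

```python
# Price you pay for a gift card bought at a percentage discount.
def discounted_price(face_value: float, discount_pct: float) -> float:
    return round(face_value * (1 - discount_pct / 100), 2)

print(discounted_price(25, 6.5))  # the £25 M&S card at 6.5% off -> 23.38
print(discounted_price(100, 4))   # a £100 card at the typical 4% discount -> 96.0
```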

If your employer doesn’t offer something similar, but you have a mortgage, then Sprive is worth considering. With Sprive, the discounts are smaller (about 3%) but the money you save is taken off your mortgage. If you decide to sign up to Sprive, use my referral code ‘HTWH65PM’ to get an additional £5 off your mortgage.

If you’re buying Amazon gift cards, it’s worth checking your personalised promotions page (sponsored link). Sometimes, Amazon offers additional discounts available if you buy gift cards in bulk.

Christmas Day is three weeks today. If you haven’t already finished your Christmas shopping, maybe reach out to your gift recipients to find out what they want. Just be aware of the last posting days for gifts.

Mounting a USB hard drive on startup on Ubuntu Core

A photo of a Raspberry Pi 4 connected to a USB external hard drive

As you’ll be aware from my regular posts about it, I have a Raspberry Pi 4 running Ubuntu Core, which acts as a server for Home Assistant, Plex and Calibre-Web. Here’s how I’ve set it up to mount an external USB hard drive on boot up.

As it’s a Raspberry Pi, the operating system and binaries sit on a microSD card, which in this case is a mere 16 GB. Whilst the me of 20 years ago would have been astounded at the concept of something so tiny holding so much data, 16 GB isn’t much nowadays. So, I have a 1 TB external USB hard drive for storing the media files for Plex and Calibre-Web.

Ubuntu Core doesn’t automatically mount USB storage devices on startup unless you tell it to, and the instructions for doing so are different when compared with a regular Linux distro.

There’s no fstab

Most Linux distros, including regular Ubuntu, include fstab for managing file systems and mounting devices. But Ubuntu Core is designed to be a lightweight distro to act as firmware for Internet of Things devices, and so it doesn’t include many tools that are common in other Linux distros. fstab is one such tool which is missing.

You can, of course, just mount a USB drive manually with the following:

sudo mkdir /media/data
sudo mount /dev/sda1 /media/data

But this won’t persist when the computer restarts. After a bit of searching, I found a solution on StackExchange; it’s for Ubuntu Core 16, but works on 22 as well.

How to tell systemd to mount your USB hard drive

It should go without saying that you should back up your system before doing any of this. If you make a mistake and systemd stops working, your device could become unbootable.

Firstly, you’ll need to run sudo blkid to list all of the file systems that Ubuntu Core can see. Find the one that starts with ‘/dev/sda1’ and make a note of the long hexadecimal string that comes after ‘UUID=’ – it’ll probably look something like ‘2435ba65-f000-234244ac’. Copy and save this, as it identifies your USB hard drive.
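If you’d rather not fish the UUID out of blkid’s output by eye, blkid can print just the one value, or you can extract it with sed. The device line below is a made-up sample for illustration – run sudo blkid on the Pi itself for the real thing:

```shell
# On a real system, this prints only the UUID (needs root):
#   sudo blkid -s UUID -o value /dev/sda1

# The same extraction from a captured blkid output line, for illustration:
line='/dev/sda1: UUID="2435ba65-f000-234244ac" TYPE="ext4"'
uuid=$(printf '%s' "$line" | sed -n 's/.*UUID="\([^"]*\)".*/\1/p')
echo "$uuid"   # -> 2435ba65-f000-234244ac
```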

Next, you’ll need to create a text file. Ubuntu Core only seems to offer the Vi text editor, which I haven’t bothered to learn to use properly. My favoured text editor is nano, but it’s not available on Ubuntu Core. Therefore, my recommendation is to create the file on another device and FTP it across. The file should be called media-data.mount; it’s really important that the file name matches the intended mount point. For example, if you’re instead planning to mount the USB hard drive to /mnt/files, this text file would need to be called mnt-files.mount.

Here’s the template for the file:

[Unit]
Description=Mount unit for data

[Mount]
What=/dev/disk/by-uuid/[Your UUID]
Where=/media/data
Type=ext4

[Install]
WantedBy=multi-user.target

You’ll need to paste in the UUID for your USB hard drive where it says ‘[Your UUID]’. You’ll also need to match the file system type; I have my external USB hard drive formatted as ext4 for maximum compatibility with Linux, but yours may use ExFAT or NTFS.

This file needs to be saved to /etc/systemd/system/media-data.mount. You can either use vi to create and save this file directly, or FTP it across and copy it over.

There are three further commands to run in turn:

sudo systemctl daemon-reload
sudo systemctl start media-data.mount
sudo systemctl enable media-data.mount

If you’ve done this correctly, then the next time you restart your device, your USB hard drive should mount automatically. If not, then you should receive some surprisingly helpful error messages explaining what you’ve done wrong.

There’s another guide at Wimpy’s World which has some additional detail and helped me get this working.

Playlist of the month: guitar-heavy indie rock

Screenshot of the guitar-heavy indie rock playlist on Spotify

Now that I’m blogging regularly again, I’ve decided to start a new monthly feature where I post a playlist of 10 songs, all around a theme. With a few hours to go until the end of the month, here’s this month’s playlist.

These songs are all indie rock songs with big guitar riffs, and are some of my favourite songs. If you want to listen along, here’s the Spotify playlist.

  • ‘Steve McQueen’ by The Automatic. This was the first single from this band’s second album, ‘This Is A Fix’. It starts with thumping guitars, and never lets up. Whilst The Automatic are best known for their debut single, ‘Monster’, this is my favourite.
  • ‘When We Wake Up’ by Asylums. I can’t quite remember how I came across this song but it’s great, and only has around 60,000 streams on Spotify so far. There’s a catchy chorus and powerful guitar riffs all of the way through.
  • ‘Praise Be’ by The Plea. Another less well-known band who I found out about because they supported Ash on tour. I really like this song and it’s surprising that it’s not more well known.
  • ‘Boulevard of Broken Dreams’ by Green Day. I’m sure people will argue whether this fits the theme, but I would argue that it’s one of their best songs from their best album.
  • ‘Ruby’ by Kaiser Chiefs. The first single from their second album, with strong guitars from the start and a catchy chorus.
  • ‘Nothing’ by A. Calling your band ‘A’ probably made sense in the days when people bought singles from high street record stores, but the move to digital platforms makes this song a little harder to find.
  • ‘Dirty Little Secret’ by The All-American Rejects. Another first single from a second album; whilst it’s not my favourite song by this band, it fits the theme.
  • ‘I Bet You Look Good On The Dancefloor’ by Arctic Monkeys. Their debut single, and their best, in my view. I know it’s a controversial opinion but none of their subsequent singles have been as good as this.
  • ‘Orpheus’ by Ash. I mentioned Ash earlier, and as they’re the band I’ve seen the most (three times) I should include one of theirs.
  • ‘Just A Day’ by Feeder. Originally a B-side to ‘Seven Days In The Sun’, this ended up being a single for a later Greatest Hits album, and is arguably among their best songs.

I’ll do another playlist of 10 songs next month. With it being December, no prizes for guessing the theme.

Ultra Processed Food

Cover images of the books about ultra-processed food mentioned in the article

Something that I’ve become more concerned about in our household is our consumption of so-called ‘ultra-processed food’. My wife has had a few health issues over the past 18 months, including an elevated risk of developing type two diabetes, which has seen her cut her sugar intake. But this coincided with the publication of several books related to ultra-processed food, and has seen us make some changes to reduce our exposure to it.

The books

Before I go into much detail, here are the books I’m talking about:

  1. Ultra-Processed People by Dr Chris van Tulleken
  2. The Way We Eat Now by Dr Bee Wilson
  3. Ravenous by Henry Dimbleby

Note: these are sponsored links, but feel free to purchase these books from your local independent tax-paying bookshop, or borrow them from a library.

If you only read one of these, read Chris van Tulleken’s Ultra-Processed People. Chris is probably better known as ‘Dr Chris’ from the CBBC show Operation Ouch, which he presents with his twin brother Dr Xand (and later Dr Ronx). He’s a triple-threat: a GP who is also a research scientist and a TV presenter, and it shows. He’s able to digest some academic research into an easily readable format, which isn’t surprising when you consider that this is what he does for his patients and his TV audience. But it also means that there’s academic rigour behind this book.

Dr Xand pops up quite a bit in this book; Chris and Xand are identical twins but have different physiques. Chris puts this down to Xand’s time in the USA, where he was exposed to higher amounts of so-called ‘ultra-processed food’, and so he’s ended up higher on the BMI scale than his brother (although Chris acknowledges that BMI is discredited). When they both contracted Covid-19 in 2020, Xand was more seriously ill than Chris.

Over the course of the book, we discover that there’s increasing evidence that ultra-processed food is linked to obesity, and how the food industry tries to downplay it.

How do we define ultra-processed food?

Chris acknowledges that it can be hard to define what ultra-processed food is. The best model that we have is the Nova classification, developed by Prof Carlos Augusto Monteiro at the University of São Paulo in Brazil. Essentially, this splits food into four groups:

  • Nova 1 – wholefoods, like fruit, vegetables, nuts etc that can be eaten with little to no processing
  • Nova 2 – culinary ingredients, like vinegar, oils, butter and salt, that require some processing
  • Nova 3 – processed food. This is basically anything that’s been cooked, so home-made bread would fall under here. Foods from the Nova 1 and 2 categories are combined to create the foods in the Nova 3 category.
  • Nova 4 – ultra-processed food, which is made from formulations of food additives that may not include any ingredients from the Nova 1 category.

Probably the easiest way to work out if something fits into the Nova 4 category is by looking at the list of ingredients. If it includes one or more ingredients that you wouldn’t expect to find at a typical large supermarket, then it’s probably ultra-processed food. Things like emulsifiers, artificial sweeteners, preservatives and ingredients identified only by those dreaded E numbers that my mum used to be wary of back in the 1980s.
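As a rough illustration of that rule of thumb, here’s a toy Python function that flags an ingredients list as suspect if it mentions an E number or a common additive keyword. To be clear, this is a crude heuristic of my own for illustration, not the official Nova methodology, and the keyword list is far from exhaustive:

```python
import re

# Keywords that tend to indicate Nova 4 ingredients (illustrative, not exhaustive)
ADDITIVE_KEYWORDS = {"emulsifier", "sweetener", "preservative", "stabiliser", "flavouring"}
# Matches E numbers like E282 or E472e
E_NUMBER = re.compile(r"\bE\d{3}[a-z]?\b", re.IGNORECASE)

def looks_ultra_processed(ingredients: list[str]) -> bool:
    """Return True if any ingredient mentions an E number or additive keyword."""
    for item in ingredients:
        if E_NUMBER.search(item):
            return True
        if any(word in item.lower() for word in ADDITIVE_KEYWORDS):
            return True
    return False

print(looks_ultra_processed(["flour", "yeast", "salt", "olive oil", "water"]))           # False
print(looks_ultra_processed(["soya flour", "preservative: E282", "emulsifier: E472e"]))  # True
```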

And there’s a lot of food that falls into the Nova 4 category. Almost all breakfast cereals, and any bread that is made in a factory, are examples.

Why are ultra-processed foods so common?

Fundamentally, it’s to do with cost and distribution. For example, a tin of tomatoes that contains some additional ultra-processed ingredients may be cheaper than a tin just containing tomatoes (and perhaps a small amount of acidity regulator). It’s a bit like how drug dealers cut drugs with flour, for example, to make more money when selling them on.

Distribution is also a factor. A loaf of bread that is baked in a factory may take a couple of days to reach supermarket shelves, where it also needs to be ‘fresh’ for a few days. So the manufacturers will add various preservatives and ingredients to ensure that bread remains soft.

You can bake your own bread using only yeast, flour, salt, olive oil and water. But Tesco will sell you a loaf of Hovis white bread that also contains ‘Soya Flour, Preservative: E282, Emulsifiers: E472e, E471, E481, Rapeseed Oil, and Flour Treatment Agent: Ascorbic Acid’. These are to keep the bread soft and extend its shelf life, as a homemade loaf may start going stale after 2-3 days. This means that a shop-bought loaf may go mouldy before it goes stale.

Other common examples

Breakfast cereals brand themselves as a healthy start to the day, but often contain worryingly-high amounts of sugar. And there’s evidence that their over-use of ultra-processed ingredients interferes with the body’s ability to regulate appetite, leading to over-eating.

Ice cream is also often ultra-processed, if you buy it in a supermarket. The extra additives ensure that it can survive being stored at varying temperatures whilst in transit. It’s notable that most UK ice cream is manufactured by just two companies – Froneri (the Nestlé, Cadbury’s, Kelly’s, Häagen-Dazs and Mövenpick brands) and Unilever (Wall’s and Ben & Jerry’s). There are many small ice cream producers, but the challenge of transporting ice cream and keeping it at the right temperature means that they have limited reach.

I’m also worried about a lot of newer ‘plant-based’ foods that are designed to have the same taste and texture as meat and dairy products. You can eat a very healthy plant-based diet, but I would argue that some ultra-processed plant-based foods may be less healthy than the meat and dairy products that they’re mimicking.

What we’re doing to cut our intake of ultra-processed food

We now bake our own bread in a bread machine. Not only do you avoid ultra-processed ingredients, but freshly-baked bread tastes so much nicer than a loaf bought in a shop. It takes a little more planning, but most of the ingredients don’t need to be kept fresh.

We also buy more premium products where we can. Rather than refined vegetable oils, we buy cold-pressed oil for frying, and I’ve mentioned chopped tomatoes above. Of course, these products cost more, and it’s something that both Chris and Henry mention in their books. It should come as no surprise that there’s a link between obesity and poverty, if people on low incomes cannot afford good food.

And we’ve had to give up Pringles. Chris devotes practically a whole chapter to them, and how they trick the brain into wanting more.

You can download the Open Food Facts app to help decipher food labels. It includes a barcode scanner, and will warn you if what you’ve scanned is ultra-processed food. The good news is that there are still plenty of convenience foods which are not ultra-processed – there are some suggestions in this Guardian article.

Whilst I haven’t yet given up on artificially-sweetened soft drinks, we reckon that we’ve cut our sugar intake and our exposure to artificial additives. In many cases, we don’t know the long-term health effects of these additives, although we do know that some people struggle to lose weight despite eating a supposedly ‘healthy’ diet and exercising regularly.

Home Assistant with HTTPS and HomeKit

A screenshot of Home Assistant running in a web browser with HTTPS enabled and no certificate errors

Welcome to the latest chapter of getting Home Assistant working on a Raspberry Pi using Docker. Last time, I’d managed to get it working in Docker, but only over a regular HTTP connection and without HomeKit. The good news is that I’ve solved both of these problems.

Using SWAG to enable HTTPS

Firstly, I recommend reading this paragraph whilst listening to ‘Swagger Jagger’ by Cher Lloyd.

I’ve tried lots of different ways to get Home Assistant working over SSL/TLS. There’s a good reason why this is one of the key selling points of Home Assistant Cloud, as it can be difficult. Thankfully, there’s a Docker image called SWAG (Secure Web Application Gateway) that handles much of the legwork. Once you’ve installed SWAG, follow this guide, and you should find that you can access your Home Assistant setup at https://homeassistant.[yourusername]. No need to specify a port, or accept any certificate warnings.
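For anyone wanting a starting point, a SWAG service in Docker Compose looks roughly like the sketch below. It’s based on linuxserver.io’s documented environment variables, assuming a DuckDNS domain; the placeholders ([yourusername], [your token]) and the host config path are mine to illustrate, so substitute your own values:

```yaml
services:
  swag:
    image: lscr.io/linuxserver/swag
    cap_add:
      - NET_ADMIN
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/London
      - URL=[yourusername].duckdns.org
      - SUBDOMAINS=wildcard
      - VALIDATION=duckdns
      - DUCKDNSTOKEN=[your token]
    volumes:
      - ./swag/config:/config
    ports:
      - 443:443
    restart: unless-stopped
```

The per-service reverse proxy configuration then lives in the mounted /config folder, where SWAG ships its sample configs.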

Inside SWAG, there’s an ACME client (certbot), which will automatically renew the SSL certificates every 90 days for you, using ZeroSSL or Let’s Encrypt. There’s also nginx, which is used to set up a reverse proxy, and support for dynamic DNS services like DuckDNS.

SWAG has sample configurations for lots of different services, including calibre-web, so I have SSL access to my calibre-web image too. My only issue with it so far was last week, when DuckDNS went down on Sunday morning. Most services, like Home Assistant, need to be mounted as subdomains (as above), but others (like calibre-web) can be mounted as subfolders, e.g. https://[yourusername]. This reduces the number of subdomains that you need SSL certificates for; ZeroSSL only offers 3 subdomains for a free account, so it’s worth considering subfolders if you want to add more services.

If you have your own domain, then you can also add a CNAME to it to point it at your DuckDNS account, should you wish to use that rather than a [something] address.

Getting Apple HomeKit working

Carrying on the musical theme, here’s ‘Carry Me Home’ by Gloworm, a 90s dance classic which has only recently become available on digital platforms again.

After getting my swagger jagger on and getting HTTPS working, the final issue I’ve been having with Home Assistant is the HomeKit bridge. Adding Home Assistant devices to Apple’s Home app is something that normally works out of the box if you install Home Assistant OS, but takes more work if you use Docker.

The instructions which helped me were these on the Home Assistant forums. You’re going to need to install another Docker image containing avahi; there are several, but this one worked for me. It’s bang up to date, unlike the most common Docker image, which is, um, 8 years out of date and also only works on x86 machines – not much help for my arm64-based Raspberry Pi 4.

Once you’ve installed avahi, added the relevant lines to configuration.yaml in Home Assistant and restarted it, HomeKit should work. To get started, add the HomeKit integration to Home Assistant – you may want to specify which devices will show if you don’t want all of them. Then, use your iPhone or iPad to scan the QR code in your Home Assistant notification panel, and add the bridge. If all goes well, it should immediately tell you that it’s an unsigned device, but will then let you set up each device in turn.
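If you set the integration up in YAML rather than through the UI, the HomeKit bridge’s options in configuration.yaml look something like this sketch – the filter block is where you limit which devices are exposed, and the domains listed here are just examples:

```yaml
homekit:
  filter:
    include_domains:
      - light
      - switch
      - climate
```

Leaving the filter out exposes everything, which can make the Home app rather cluttered.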

If it just sits there for several minutes and then gives up, you’ll need to do some more digging. Don’t worry, this happened to me too. I suggest downloading the Discovery app, which shows all of the mDNS devices broadcasting on your network. If you can’t see ‘_hap._tcp’ in the list, then there’s a problem. In my case, this turned out to be because my Raspberry Pi wasn’t connected to the same wifi network. It’s plugged in to my ADSL router with a network cable, but we use Google Wifi which results in a ‘double NAT’ situation. Connecting the Raspberry Pi to both wired and wireless connections seemed to fix the issue.

Indeed, as a side effect Home Assistant managed to autodiscover some additional devices on my network, which was nice.

Home Assistant Core in Docker? Done it, mate

All in all, I’ve successfully managed to get Home Assistant to where I want it to be – self-updating in Docker, secure remote access, and a HomeKit bridge so that I can ask Siri to manage my devices. I’m looking forward to being able to turn my heating on whilst driving, for example.

It’s been a challenge, requiring a lot of skimming through the Home Assistant forums and various StackExchange discussions. Ideally, I would have a spare computer to run Home Assistant OS, which would have taken some of the leg work out of this, but I’m happy with the setup. Finding SWAG and getting it to work was a moment of joy, after all the setbacks I’d had before.

Using Portainer to manage Docker

Screenshot of the Portainer web interface

So you may have noticed that I have a thing going on with Docker at present. I’ve set up Home Assistant in Docker, and more recently also set up calibre-web with Docker. Between these, and other Docker images, it’s quite a lot to manage – especially on a headless remote device. Thankfully, Portainer is a web-based solution for managing multiple Docker containers.

There’s a free Community Edition which offers sufficient features to manage one Docker system, which is what I’m using. If you need to manage multiple systems, there’s a paid-for Business Edition, but home users should get by with the Community Edition – although you will see lots of greyed-out options which are only available in the Business Edition, something anyone who uses a freemium WordPress plugin will recognise.

The installation instructions are detailed, and there are a number of steps that you’ll need to follow using the command line. Once everything’s set up, you’ll be able to open a web browser and see all of your Docker containers, and their status.

Portainer lets you start, stop and restart containers from the web interface, and delete any containers no longer needed. The feature that I’ve found most useful is the ‘Duplicate/Edit’ function, which allows you to easily duplicate a container, and optionally replace the original container with a new one with updated variables. This is great for people like me who invariably make a mistake when setting up a Docker Compose file. Logs are also made easily accessible, which helped me when troubleshooting a container that was starting but then wasn’t accessible through a web browser.

You can also run new containers in Portainer; whilst this is easier than typing out commands, Docker Compose works better for me as you can just copy and paste them.

If you’ve got a few Docker images up and running, I would recommend Portainer as an easier way of managing them. It’s much nicer than having to type out commands in an SSH session, and is a friendlier way of working with Docker for less experienced users like myself.

Managing e-books with Calibre-web

Screenshot of the calibre-web interface

If, like me, you’ve picked up a number of e-books over the years, you may use Calibre as your e-book manager. It’s a desktop application with an optional web interface, but it has its drawbacks. The user interface is clunky, and it tries to cram lots of advanced features in – even the latest version 7 is overwhelming for new users. So, if you can forego the desktop application, there’s an alternative called calibre-web that does the same thing in a web browser, and with a much nicer interface.

Once installed, you can migrate your existing metadata.db from Calibre and the e-book folders, and calibre-web will pick up where you left off. I particularly like the ability to download metadata from sources such as Google Books, to get more complete data about each book besides its author and title. There’s a built-in e-reader, or you can use an app that supports OPDS – I used Aldiko.

By far the easiest way to install it is using Docker. There’s a good image on DockerHub; it’s maintained by a third-party but recommended by calibre-web’s developers. Once installed, it doesn’t require much additional configuration.
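For reference, a Docker Compose sketch for the linuxserver.io calibre-web image looks roughly like the following – the host paths are placeholders you’d swap for your own, and the library path should point at the folder holding your metadata.db and e-book folders:

```yaml
services:
  calibre-web:
    image: lscr.io/linuxserver/calibre-web
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/London
    volumes:
      - /path/to/config:/config
      - /path/to/library:/books   # folder containing metadata.db and your e-books
    ports:
      - 8083:8083
    restart: unless-stopped
```

Once it’s running, the web interface is on port 8083, and you point it at the /books folder on first login.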

By default, calibre-web doesn’t allow uploads, but you can amend this in the Admin settings. The settings toggle is rather buried away, and it took me some time to find it. But once uploads are enabled, it allows you to completely replace the desktop Calibre app if you want to. You can also set up multiple user accounts if you want to share your calibre-web server with others.

I have calibre-web installed on the same Raspberry Pi as my Plex and Home Assistant servers. Indeed, calibre-web essentially offers a kind of Plex for e-books, seeing as Plex doesn’t offer this itself. Unfortunately, most of my e-books were purchased through Amazon, and so are only accessible through their Kindle apps and devices. But for the handful of books that I’ve picked up through the likes of Unbound and Humble Bundle, it’s helpful to have them in one place.

Comment Spam strikes back

An illustration of a robot turning web pages into canned meat product. Generated using Bing AI Image Generator

So now that I’m blogging again, it’s the return of comment spam on my blog posts.

Comment spam has always been a problem with blogs – ever since blogs first allowed comments, spam has followed. Despite the advent of the rel=”nofollow” link attribute, automated bots still crawl web sites and submit comments with links, in the hope that this will boost their rankings in search engines.
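For anyone unfamiliar with it, rel=”nofollow” is simply an attribute added to a link’s markup to tell search engines not to pass ranking credit through it – WordPress and most other blogging tools apply it to links in comments by default:

```html
<!-- A link in a blog comment, as emitted by the blogging software -->
<a href="https://example.com/" rel="nofollow">spammy anchor text</a>
```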

In the early days of blogging, blogs often appeared high in Google’s search engine results – by their very nature, they featured lots of links, were updated frequently, and the blogging tools of the time often produced simple HTML which was easily parsed by crawlers. So it was only natural that those wanting to manipulate search engine rankings would try to take advantage of this.

I’ve always used Akismet for spam protection, even before I switched to WordPress, and it does a pretty good job. Even so, I currently have all comments set to be manually approved by me, and last week a few got through Akismet that I had to junk manually.

Humans, or AI?

These five comments interested me because they were more than just the usual generic platitudes about this being a ‘great post’ that ‘taught me so much about this topic’. They were all questions about the topic of the blog post in question, with unique names. However, as they all came through together, and had the same link in them, it was clear that they were spam – advertising a university in Indonesia, as it happens.

Had it not been for the prominent spam link and the fact they all came in together, I may not have picked up on them being spam. Either they were actually written by a human, or someone is harnessing an AI to write comment spam now. If it’s the latter, then I wonder how much that’s costing. As many will know already, AI requires a huge amount of processing power, and whilst some services are offering free and low-cost tools, I can’t see this lasting much longer as the costs add up. But it could also just be someone being paid through services like Amazon Mechanical Turk, even though such tasks are almost certainly against their terms of service.

I think I’m a little frustrated that comment spam is still a problem even after a few years’ break from blogging. But then email spam is a problem that we still haven’t got a fix for, despite tools like SPF, DKIM and DMARC. I’m guessing people still do it because, in some small way, it does work?