Installing Free SSL on Apache Webserver

Disclaimer: I haven’t learnt much about HTTPS so there may be some misused terms here and there.

My personal website has been up on my DigitalOcean server for about a year now. I access the website almost every day to update my expense manager. However, I never installed SSL on the server, so all of my financial activities have been travelling back and forth in plain text!

So I decided to secure and authenticate my server using HTTPS. However, most trusted SSL certificates signed by a Certificate Authority (CA) are not free (this is one of the reasons I hadn’t installed an SSL certificate up until now).

Then I discovered Let’s Encrypt, a free, automated, and open CA. They had me at “free, automated”.

So I started setting up my server to adopt the certificate. My website is Ruby (2.3.0) on Rails (4.2.6) on Passenger (5.0.29) on Apache (2.4.7) on Ubuntu (14.04 32 bit) (phew). First, clone Let’s Encrypt from GitHub (OPEN!).

sudo git clone https://github.com/letsencrypt/letsencrypt /opt/letsencrypt && cd /opt/letsencrypt

Then the automatic part. This command will create a new certificate for the domain you provide and set up Apache automatically (the exact flags may differ between client versions, so treat this invocation as an approximation; YOURDOMAIN.COM is a placeholder):

./letsencrypt-auto --apache -d YOURDOMAIN.COM

You can add multiple domains/aliases/subdomains by passing several -d flags (at the time of writing, Let’s Encrypt does not issue wildcard *.YOURDOMAIN.COM certificates, so each name has to be listed explicitly). I ran the command for my domain and subdomains. After that, you’ll be asked to provide an email address (you know, just in case) and to choose whether you still allow HTTP or force all traffic to be HTTPS. For me, it didn’t matter because I ended up changing the Apache config file later.

The command above will create a new Apache configuration file in /etc/apache2/sites-available/000-default-le-ssl.conf (I forgot the exact filename, something like that). However, because I had already set up Apache to use my own configuration file, I had to modify mine to use the certificate.

I think it’s best to redirect all your traffic to HTTPS (as long as there’s no need for plain HTTP connections). So, I modified my conf file by changing the virtual host port from 80 to 443. This way, HTTPS requests will connect to the previously set up virtual host. However, HTTP requests need to be redirected, so I added a new virtual host:

<VirtualHost *:80>
    ServerName YOURDOMAIN.COM
    Redirect permanent / https://YOURDOMAIN.COM/
</VirtualHost>

Finished. Every request to my website will be in HTTPS. But wait… no SSL certificate had been installed yet, because Let’s Encrypt put it in another configuration file. So, I opened up that configuration file and copied some of the contents over to mine. The lines I copied were the certificate paths, something like:

SSLCertificateFile /etc/letsencrypt/live/YOURDOMAIN.COM/fullchain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/YOURDOMAIN.COM/privkey.pem
And I added

SSLEngine on

before those lines.
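Putting it together, the resulting HTTPS virtual host ends up looking something like this (certificate paths are the Let’s Encrypt defaults; ServerName and DocumentRoot are placeholder assumptions, not my actual config):

```apache
<VirtualHost *:443>
    ServerName YOURDOMAIN.COM
    DocumentRoot /var/www/YOURDOMAIN.COM/public

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/YOURDOMAIN.COM/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/YOURDOMAIN.COM/privkey.pem
</VirtualHost>
```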

I did the same (redirection and SSL copy) for the other virtual host serving my subdomain.

After finishing the configuration file, I set up Apache to use it again (because Let’s Encrypt had made it use the auto-generated configuration file). I also made sure to enable the SSL mod.

sudo a2enmod ssl
sudo a2dissite 000-default-le-ssl.conf
sudo a2ensite MY_CONF_FILE_NAME.conf
sudo service apache2 restart

FINISHED. However, note that a Let’s Encrypt certificate expires after 90 days. The way around this is to renew the certificate regularly (before the expiry date, of course). This can be done automatically using a cron job (which I will write about in this blog if I have the time some day).
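As a sketch, a crontab entry like the following would attempt a renewal twice a month and reload Apache on success (the renew subcommand and paths depend on your client version, so treat this as an assumption, not a tested recipe):

```
# run at 03:00 on the 1st and 15th of every month
0 3 1,15 * * /opt/letsencrypt/letsencrypt-auto renew >> /var/log/le-renew.log 2>&1 && service apache2 reload
```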




How I Organise My Photos

I’m a sucker for cloud storage.

I have unlimited storage in Google Drive and I put everything up there in the clouds. Files that consume most of my Google Drive space (more than 1 TB as of now) are photos. I take LOTS of photos. I store the RAW files from my camera in Google Drive too.

I’m a bit obsessed about how my photos are organised too, sadly.

Currently, I’m using both Dropbox and Google Drive to organise my photos. I use Dropbox to upload the photos from my phone automatically, because it converts the filenames to a format I like, unlike Google Photos (which retains the original filename). However, I still upload my photos (albeit in ‘High Quality’ size) to Google Photos because it has so many awesome features.

Because I only have 16 GB of Dropbox space, I regularly move the photos to my Google Drive. Besides having them in the right cloud, I also want my photos organised in such a way that it’s easy for me to find (and reminisce about) them. So here’s how I do it.


For camera photos from my phone and tablet, I use Dropbox’s Camera Upload feature to upload the camera results to Dropbox using WiFi. The photos will then be downloaded to my computer using the Dropbox desktop app.

For DSLR photos, I upload them directly from my computer to Google Drive using Insync.

For other types of photos, I either upload them directly from my computer, or use BitTorrent Sync to upload them from my phone/tablet to my computer (and subsequently be uploaded to Google Drive).


Here’s the best part.

With a little magic from my programming knowledge, I automatically organise my entire photo collection. One key thing: I like the filename format used by Dropbox:

/20[0-9]{2}-[0-9]{2}-[0-9]{2} [0-9]{2}\.[0-9]{2}\.[0-9]{2}/

It basically names the photo based on its creation date, for example: “2016-07-24 01.28.25.jpg”. So, I want all my photos to be named in that format. Aside from that, I also want them organised in directories based on the year and month, so the photo mentioned above would be placed in “$SOME_DIR/2016/07”.
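As a quick sanity check in Ruby (the filename here is just an example), the pattern and the year/month bucket look like this:

```ruby
# Dropbox-style names: "YYYY-MM-DD HH.MM.SS", dots escaped so they match literally
PATTERN = /\A20[0-9]{2}-[0-9]{2}-[0-9]{2} [0-9]{2}\.[0-9]{2}\.[0-9]{2}/

name = "2016-07-24 01.28.25.jpg"
if name =~ PATTERN
  year, month = name[0..3], name[5..6]
  puts "#{year}/#{month}"   # prints "2016/07", the directory bucket for this photo
end
```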

Photos from phone/tablet are automatically named as such. However, they’re uploaded to Dropbox! So I created a Ruby script (which runs on my local machine) to move them to the destination directory (which syncs to Google Drive).

require 'fileutils'

# Move a photo into its year/month directory, creating the directory if needed.
def move_file(new_dir, new_file, f)
  unless File.exist?(new_file)
    FileUtils.mkdir_p(new_dir)
    FileUtils.mv(f, new_file)
  end
end

# Work relative to the Dropbox camera uploads directory.
Dir.chdir("/home/araishikeiwai/$DROPBOX_CAMERA_UPLOADS")

fn = Dir.entries(".")
fn.each do |f|
  if File.file?(f)
    year = f[0..3]    # "2016-07-24 01.28.25.jpg" -> "2016"
    month = f[5..6]   # -> "07"
    new_dir = "/home/araishikeiwai/$GOOGLE_DRIVE_CAMERA_DIR/#{year}/#{month}/"
    new_file = new_dir + f
    move_file(new_dir, new_file, f)
  end
end

I run the script every two hours using cron. This way, I always have my photos organised by year and month on Google Drive. Further, when there are photos from a certain event, I move them into their own directory under the root photo directory, following the same year/month structure.
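The schedule itself is a one-line crontab entry (the script path here is hypothetical):

```
# every two hours, on the hour
0 */2 * * * /usr/bin/ruby /home/araishikeiwai/scripts/organise_photos.rb
```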


For DSLR photos, I don’t always convert them all to JPG. Most of the photos from my DSLR camera are RAW; I only convert what I need to share or post on Instagram/500px/Facebook. I use Darktable for my post-processing needs and set up its export filename format to match. When there are many photos from a certain event, I follow the same rule.

For other photos from phone apps, such as Instagram, I created a script to automatically move and rename them to the same format.
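I haven’t shown that script, but the idea is the same. A minimal sketch (the source directory is hypothetical, and I’m using the file’s modification time as a stand-in for the photo’s creation date) could look like this:

```ruby
require 'fileutils'

# Build a Dropbox-style name ("YYYY-MM-DD HH.MM.SS.ext") from a file's
# modification time.
def dropbox_style_name(path)
  File.mtime(path).strftime("%Y-%m-%d %H.%M.%S") + File.extname(path)
end

# Hypothetical source directory; rename every JPG in place.
Dir.glob("/home/araishikeiwai/Instagram/*.jpg").each do |f|
  new_name = dropbox_style_name(f)
  next if File.basename(f) == new_name
  FileUtils.mv(f, File.join(File.dirname(f), new_name))
end
```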

One interesting thing to note is screenshots. I organise all my screenshots in one directory. So the screenshots from my computer (taken using Shutter), phone (QuickMemo+), and tablet (standard iOS screenshot) are all moved and renamed (following the same format, of course) into the same directory. Again, based on year and month.

This seems tedious, with heavy internet use (downloading from Dropbox and uploading back to Google Drive), and it only works when my computer is running (although my computer is almost always running). I tried creating a Google Apps Script to download the photos from Dropbox and upload them directly to Google Drive (bypassing my local computer); however, there’s a 10 MB limit on file size, which can be a problem when I have videos to move. I’ll write another post on how I utilise Google Apps Script and the Dropbox API to migrate files between them automatically.