
Posts

Setting up a new user in Ubuntu from scratch

Adding new users to Ubuntu is easy thanks to the convenience tools that exist. Start with the command sudo useradd -d /home/testuser -m testuser, which creates the user and sets up a default home directory. The user doesn't have a password, but you can add one with passwd if you want. Then create a .ssh directory in their home directory, create a file called authorized_keys inside it, and copy the contents of the user's public key into it. Chown the .ssh directory (and file) to the user and chmod the file to 600; the directory should be mode 700. Make sure that /etc/ssh/sshd_config is set up to deny logging in by password. If you want to set up their bash profile you can copy the ".profile" and ".bashrc" files into their home directory. Remember to edit /etc/passwd and set their shell to bash. The user should then be able to log in with their public key by setting up .ssh/config on their home machine: Host foo HostName server.ip.addres...
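The key-setup steps above can be sketched as shell commands. This version runs against a temporary directory instead of a real /home/testuser so it needs no root; the username and key line are placeholders, and in practice you'd run the useradd/chown steps with sudo.

```shell
# Stand-in for /home/testuser so this can run without root
HOME_DIR=$(mktemp -d)

# Create .ssh and drop the user's public key into authorized_keys
mkdir "$HOME_DIR/.ssh"
echo "ssh-ed25519 AAAAC3Nz... testuser@laptop" > "$HOME_DIR/.ssh/authorized_keys"

# sshd ignores keys with loose permissions: 700 on .ssh, 600 on the file
chmod 700 "$HOME_DIR/.ssh"
chmod 600 "$HOME_DIR/.ssh/authorized_keys"

# On the real box you'd also: sudo chown -R testuser:testuser /home/testuser/.ssh
stat -c '%a' "$HOME_DIR/.ssh" "$HOME_DIR/.ssh/authorized_keys"
```

The final stat call should print 700 and then 600, confirming the modes are what sshd expects.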

Fixing missing msvcp110.dll in xampp on Windows

Microsoft Happiness I need to use a Windows server to deploy a program I'm busy writing. I kept getting a problem with Apache starting up that prevented it from loading the MS SQL Server PDO drivers. The message was: "The program can't start because msvcp110.dll is missing from your computer. Try reinstalling the program to fix the problem", which usually means the Visual C++ Redistributable for Visual Studio 2012 package is not installed. I had previously installed it (it's available on the Microsoft site) but was still getting the error. Eventually I stumbled on a topic on the Apache Friends forum which advised copying the msvcp110.dll file to both the \xampp\apache\bin and the \xampp\php directories. Apparently Apache wasn't able to follow the OS path to find the file. In my case it was already in the php directory but not alongside the other Apache binaries. After copying it there, Apache restarted without errors and the PDO dr...
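Concretely, the fix is a pair of copies from a Command Prompt; the System32 source path is an assumption, so adjust it to wherever your copy of msvcp110.dll actually lives:

```
copy C:\Windows\System32\msvcp110.dll C:\xampp\apache\bin\
copy C:\Windows\System32\msvcp110.dll C:\xampp\php\
```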

Fixing broken thumbnails in Drupal

If your autoscaled images are not loading on Drupal, here are some steps to troubleshoot the problem. Firstly, if you have logging enabled, visit http://yoursite.com/admin/reports/dblog to see a list of the events. If you see events with the message "Unable to generate the derived image located at...." then check the permissions on your files directory, which is usually /sites/default/files. Then check that the GD library is installed for PHP; on Ubuntu you can install it with apt-get install php5-gd. If you don't see any events, try opening the image URL in a new browser tab to confirm that your web server is properly rewriting the request to index.php instead of trying to serve a file straight out of the directory. On Nginx you should consider using location blocks like this:

    # These next locations help with thumbnails - https://www.drupal.org/node/2374961
    location @rewrite {
        rewrite ^/(.*)$ /index.php?q=$1 last;
    }
    ...
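For context, the Drupal issue linked above pairs that @rewrite block with a location that catches image-style URLs; treat the following as a sketch to adapt to your own vhost rather than a drop-in config:

```nginx
# Image derivatives are generated on first request, so a missing file
# must fall through to Drupal instead of returning a 404.
location ~ ^/sites/.*/files/styles/ {
    try_files $uri @rewrite;
}

location @rewrite {
    # Hand the request to index.php so Drupal can generate the thumbnail
    rewrite ^/(.*)$ /index.php?q=$1 last;
}
```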

Setting up Nginx as a reverse proxy for Apache with SSL termination

Reverse Proxy diagram from Wiki Commons We're currently hosting client sites on a Rackspace server and using their Load Balancer feature as a way to terminate SSL without needing multisite certificates. We only attach one node to each Load Balancer, so we're paying for more than we're using. My proof of concept is to use Nginx to terminate SSL and proxy to the Apache server. This will save us £225 per load balancer, and since we're using ten of them that's quite a significant saving. My first step was to spin up a free-tier EC2 instance running Ubuntu 14.04 LTS; you can replace this with your favourite cloud or on-the-metal server. Then I installed my packages. These are the ones I remember, so YMMV: sudo apt-get install nginx apache2 fail2ban php5-fpm mcrypt php5-mcrypt openssl php5-cli php5 libapache2-mod-php My network diagram is slightly different from the picture for this post in that the web server is hosted on the ...
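A minimal sketch of the termination-and-proxy server block follows. It assumes Apache has been moved to listen on 127.0.0.1:8080; the domain and certificate paths are placeholders:

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    # Placeholder paths - point these at your real certificate and key
    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        # Apache listening locally on 8080 handles the PHP
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # Lets the backend app know the original request was HTTPS
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

The X-Forwarded-* headers matter here: once Nginx terminates SSL, Apache only ever sees plain HTTP from localhost, so the application has to read these headers to reconstruct the client IP and scheme.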

Securing Jenkins with oAuth

Jenkins is pretty easy to secure with the help of some useful plugins. The first I suggest using is an oAuth provider. Our repositories are hosted on Bitbucket so I'm using their oAuth plugin, but there is also a GitHub oAuth plugin. The instructions to set up the plugin are very clear (see the plugin page). When you're configuring Jenkins to use oAuth security, remember to leave the Authorization setting on "logged in users can do anything" for now. We'll change this later, but we don't want to get locked out of Jenkins when we apply the security settings. Now install the Role-based Authorization Strategy plugin (see the plugin page). Add a new group called "Anonymous" and uncheck everything. When a user logs in via oAuth they'll be shown a message by Jenkins saying that they don't have any permissions. This means that not everybody with a Bitbucket account can access your site, so that's a good thing. You just need ...

Checking the SSL certificates for a list of domains

We have a number of domains that are secured by SSL and need to be able to automate checks for certificate validity and expiry. Luckily there is a script that does exactly this. On Ubuntu you can apt-get install ssl-cert-check, and there are copies of the script online in case your distro doesn't have it as a package. Create a file with a list of the domains you want to check and the port to check on. It should look something like this:

    yourdomain.com 443
    www.anotherdomain.com 443
    www.yetanotherclientdomain.com 443

Let's assume you called your file domainlist.txt. You can then run ssl-cert-check -f domainlist.txt to have the tool run through it and print the status and expiry date of each domain to the console. Using the options shown in the script's help page, you can also have it send you an email if a certificate is going to expire soon: ssl-cert-check -a -f domainlist.txt -q -x 30 -e yourmail@foo.com If you get a messa...
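Putting the pieces together in one place (the domain names are the placeholders from above, and the invocations are guarded so the snippet is harmless on a machine without the script installed):

```shell
# Build the domain list: one "host port" pair per line
cat > domainlist.txt <<'EOF'
yourdomain.com 443
www.anotherdomain.com 443
EOF

# Print status and expiry for each domain to the console
command -v ssl-cert-check >/dev/null && ssl-cert-check -f domainlist.txt

# Or run quietly and mail a warning if anything expires within 30 days
command -v ssl-cert-check >/dev/null && \
    ssl-cert-check -a -f domainlist.txt -q -x 30 -e yourmail@foo.com
```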

Allowing the whole wide world to read your S3 bucket

This is a bucket policy that you can use to allow the whole world to read files in your S3 bucket. You might want to do this if you're serving static web content from S3 and don't need the fine-grained control that the Amazon documentation details. You will still need to set up permissions on the bucket, but this policy will let people read the files you're storing on S3.
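The excerpt cuts off before the policy itself, but a public-read policy of this kind follows AWS's standard example: an Allow statement granting s3:GetObject to everyone. Replace your-bucket-name with your bucket; the Sid is arbitrary.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

Note the /* on the Resource ARN: s3:GetObject applies to objects, not the bucket itself, so the wildcard object path is required.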