27 January 2018

Setting up Raspbian OS full unattended updates

Just a quick note to self for the future. The instructions in this post will set up automatic updates of all packages (as opposed to security-only updates). They have been tested on Raspbian Stretch.
sudo apt-get install unattended-upgrades

As root, edit /etc/apt/apt.conf.d/50unattended-upgrades and add the lines below to the Unattended-Upgrade::Origins-Pattern section:

Unattended-Upgrade::Origins-Pattern {
        "origin=Raspbian,codename=${distro_codename},label=Raspbian";
        "origin=Raspberry Pi Foundation,archive=stable";
};

Now run the following command to test it:
sudo unattended-upgrade -d

Also consider changing the following lines in 50unattended-upgrades. These remove unused packages and, if needed (e.g. after a kernel upgrade), reboot automatically at a set time regardless of logged-in users:
Unattended-Upgrade::Remove-Unused-Dependencies "true";
Unattended-Upgrade::Automatic-Reboot "true";
Unattended-Upgrade::Automatic-Reboot-WithUsers "true";
Unattended-Upgrade::Automatic-Reboot-Time "06:30";

To ensure that the service is enabled and operational, run:
sudo systemctl status unattended-upgrades
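
Finally, the schedule itself is driven by APT's periodic settings, which on Debian-based systems normally live in /etc/apt/apt.conf.d/20auto-upgrades (a minimal sketch - the file may already exist, e.g. created via dpkg-reconfigure unattended-upgrades):

# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";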


07 September 2017

Building a local Dockerised Oracle 12.1 SE2 development database with sample schemas and data

Just a quick note on how to quickly build a Dockerised local Oracle development database containing the Oracle-provided sample schemas and data; i.e. the HR schema and friends. Oracle has for some time provided Docker build scripts for a variety of their products on github. Oracle has also provided the sample schemas for their databases on github.

Requirements/assumptions:
  • You're running Docker on Windows; tested with Windows 10 Pro and Docker CE via Hyper-V
  • You have installed Ubuntu Bash on Windows; tested with Windows build 15063 running Ubuntu 16.04
  • You have enabled Docker option: Expose Daemon on tcp://localhost:2375 without TLS
Start Ubuntu Bash and run:
sudo apt install docker.io wget
echo 'alias docker="/usr/bin/docker -H=localhost:2375"' >> ~/.bash_aliases
exit
# Start Ubuntu bash again and run:
mkdir -p ~/dev/git/oracle
cd ~/dev/git/oracle
# Clone Oracle Docker repo
git clone https://github.com/oracle/docker-images.git
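At this point it's worth confirming that the docker client can actually reach the daemon exposed on TCP (a quick check; any docker command will do):
# Should print both Client and Server sections if the connection works
docker version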
Download the two files linuxamd64_12102_database_se2_1of2.zip and linuxamd64_12102_database_se2_2of2.zip from the Oracle download page, under the heading (12.1.0.2.0) - Standard Edition (SE2), and put them in the docker-images/OracleDatabase/dockerfiles/12.1.0.2 directory of the cloned repository. Now run the following commands from Ubuntu bash to build the docker image:
cd ~/dev/git/oracle/docker-images/OracleDatabase/dockerfiles
# Build docker image for 12.1.0.2 SE2 edition (takes a while)
./buildDockerImage.sh -v 12.1.0.2 -s

docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             SIZE
oracle/database     12.1.0.2-se2        dc825c15ed24        28 minutes ago      10.3GB
oraclelinux         7-slim              c0feb50f7527        4 weeks ago         118MB

Run the next commands to create a running docker container with the database (takes a while):
docker run --name orclcdb_12102se2 -p 1521:1521 -p 5500:5500 \
 -e ORACLE_SID=orclcdb -e ORACLE_PDB=orclpdb1 -e ORACLE_PWD=mysecret42 \
 -e ORACLE_CHARACTERSET=AL32UTF8 oracle/database:12.1.0.2-se2
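The first start takes a while, as the database is created inside the container. One way to follow progress is to tail the container logs from a second shell; the Oracle scripts should print "DATABASE IS READY TO USE!" once setup completes:
# Watch the database creation progress
docker logs -f orclcdb_12102se2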
Now download and install the Oracle sample schemas:
cd ~/dev/git/oracle/
wget https://github.com/oracle/db-sample-schemas/archive/v12.1.0.2.zip
unzip v12.1.0.2.zip
cd db-sample-schemas-12.1.0.2/
perl -p -i.bak -e 's#__SUB__CWD__#/opt/oracle/db-sample-schemas-12.1.0.2#g' *.sql */*.sql */*.dat
cd ..
# This copies the sample schemas inside the container (the alternative would have been to mount a docker volume on container creation/start):
docker cp db-sample-schemas-12.1.0.2/ orclcdb_12102se2:/opt/oracle
# Start a bash shell inside the container
docker exec -t -i orclcdb_12102se2 /bin/bash
cd /opt/oracle/db-sample-schemas-12.1.0.2
mkdir log
sqlplus system/mysecret42@//localhost:1521/orclpdb1
@mksample mysecret42 mysecret42 mysecret42 mysecret42 mysecret42 mysecret42 mysecret42 mysecret42 users temp /opt/oracle/db-sample-schemas-12.1.0.2/log orclpdb1
And that's it - the docker container's Oracle database should now be populated with the Oracle sample schemas and data.
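As a quick sanity check, the following can be run from the container's bash prompt (all the sample-schema passwords were set to mysecret42 by the mksample call above); hr.employees should contain a hundred-odd rows:
sqlplus -s hr/mysecret42@//localhost:1521/orclpdb1 <<EOF
SELECT COUNT(*) FROM employees;
EOF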
Note that by default a Container database with a Pluggable database is built (even for Standard Edition 2). However, this can be changed to build just a standalone database by editing the ~/dev/git/oracle/docker-images/OracleDatabase/dockerfiles/12.1.0.2/dbca.rsp.tmpl file and changing the option CREATEASCONTAINERDATABASE from true to false. Then re-run buildDockerImage.sh to build a new image.

24 August 2017

Retrieving stack traces/backtraces from Apache/PHP crashes on RedHat Linux

This post is effectively an amalgamation of three documents, each of which deals with a piece of the puzzle of generating stack traces/backtraces from crashing PHP code; this note takes the more holistic view of getting it done.

Evidence that crashes occur is typically located in the /var/log/httpd/error_log* files - e.g. entries like:

[timestamp] [core:notice] [pid 7528] AH00052: child pid 11172 exit signal Segmentation fault (11)
[timestamp] [core:notice] [pid 7528] AH00052: child pid 11063 exit signal Segmentation fault (11)

In summary, it is possible to retrieve stack traces of PHP code crashes that cause the Apache process to terminate (e.g. with a Segmentation fault). This is achieved with the GNU Debugger (gdb). One piece of the puzzle that is easily missed is the requirement for the yum-utils package, which contains the debuginfo-install command.

To generate the stack trace we use gdb to start/run httpd in debug mode (-X).

First things first, we need to install various packages - please do so only on a desktop or dev/test server that can suffer a period of outage:

sudo yum install gdb yum-utils kernel-debuginfo kernel-debuginfo-common

When you first run gdb it will tell you whether it has the requisite debug information - and if not, it will print the command needed to install it.

In our case, when running "gdb /sbin/httpd" as per below we were notified to run:


sudo debuginfo-install httpd-2.4.6-67.el7_4.2.x86_64

We exited gdb (q + ENTER), installed the above and then ran gdb again as per below.


sudo -s

# Stop apache (e.g. on a dev/test environment)
service httpd stop

# Run gdb
gdb /sbin/httpd

GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-100.el7
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
For bug reporting instructions, please see:
...
Reading symbols from /usr/sbin/httpd...Reading symbols from /usr/lib/debug/usr/sbin/httpd.debug...done.
done.

(gdb) run -X
Starting program: /sbin/httpd -X
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Detaching after fork from child process 10363.
Detaching after fork from child process 10364.
Detaching after fork from child process 10365.

# Now run the request through the client to the Apache server, which causes the crash:


Program received signal SIGSEGV, Segmentation fault.
0x00007fffe8b36c49 in _zval_ptr_dtor () from /usr/local/zend/lib/apache2/libphp7.so
 

# Produce the backtrace:
(gdb) bt
#0  0x00007fffe8b36c49 in _zval_ptr_dtor () from /usr/local/zend/lib/apache2/libphp7.so
#1  0x00007fffd9ada5b1 in php_oci_bind_hash_dtor () from /usr/local/zend/lib/php_extensions/oci8.so
#2  0x00007fffe8b57bb2 in zend_hash_destroy () from /usr/local/zend/lib/apache2/libphp7.so
#3  0x00007fffd9ae552b in php_oci_statement_free () from /usr/local/zend/lib/php_extensions/oci8.so

...
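
To capture the whole trace to a file for a support ticket, gdb's logging facility is handy (a small sketch - the filename is just an example; bt full additionally dumps the local variables of each frame):

(gdb) set logging file php-crash-bt.txt
(gdb) set logging on
(gdb) bt full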

As can be seen above, the culprit is the function _zval_ptr_dtor, which is called from the OCI (Oracle Call Interface) driver and is related to freeing up the OCI statement.

If you have support with e.g. Zend/Roguewave it is now possible to raise a support request with them and include the stack trace.

Alternatively, one could search for the crashing function on https://bugs.php.net for any outstanding bugs and/or file a new bug report.


25 May 2017

Studying for the AWS Certified Solutions Architect Professional Exam - One Approach

After I announced on LinkedIn that I had passed the AWS CSA Professional exam, I received a number of questions from others asking me how and what to study. Instead of responding to individual requests, I decided to write this post to explain how and what I studied for the exam.

Not to detract from the fantastic work that the A Cloud Guru team are doing, but doing the CSA Professional course alone is highly unlikely to get you a pass (and Ryan does state this in his course material in no uncertain terms!). I personally think that a lot more study is needed. Hence this blog post...

First of all, I have somewhere between 3 and 4 years of AWS experience gained from various proofs-of-concept back in 2010-2012, working in a start-up between 2013-2016 and now consulting and working with a variety of businesses in the DevOps and Solution Architecture space.

I spent 5-6 weeks studying. As I started my studies I booked, and therefore locked in, the date of the exam - I find this a fantastic motivator to keep going with the studies. I find it hard to study for something if I can't see the goal posts; there are so many distractions and tactical detours in life! ;-)

To get an idea of what I was facing I started out with A Cloud Guru's CSA Professional course and watched all the videos over approximately a week.

I spent approximately another two weeks reading the (currently 6) recommended white papers from the certification blueprint, plus a few other white papers too:
  1. AWS Best Practices for DDoS Resiliency
  2. Using Amazon Web Services for Disaster Recovery
  3. Securing Data at Rest with Encryption
  4. AWS Security Best Practices
At this point I decided to go for gold and took the practice exam. I didn't do fantastically well, scoring 62%, and it was a call to arms in terms of realising that more study was warranted (although note that the practice exam isn't great - see later).

I spent the final 2 weeks reading & skimming the actual AWS documentation (safely ignoring all the code samples ;-) at a rate of approximately 1 document every 1-2 days. The documentation is long, but do not underestimate the value of doing this! There are some valuable insights and constraints in there - and when there's a section with the headline Note or Important, do take note! I focused on the following documentation:
  1. AWS Cloudfront - Developer Guide
  2. AWS S3 - Developer Guide
  3. AWS EC2 - User Guide for Linux
  4. AWS EC2 - VM Import/Export User Guide
  5. AWS Management Portal for vCenter User Guide
  6. Amazon Kinesis Streams Developer Guide
  7. Amazon VPC Peering Guide
  8. AWS DirectConnect User Guide
  9. AWS Storage Gateway - User Guide
  10. Amazon Glacier - Developer Guide


During the final 2 weeks I also watched the A Cloud Guru Domain Wrap-up videos a couple of times; and even a few of the full videos where I felt this was warranted.

In the end, using this approach got me an 86% score on the final exam.

Some important take-away points from my experience were:
  • Leave yourself 2 weeks from taking the practice exam before you take the full exam
  • Pay attention to the percentage breakdown of questions per domain - and spend a proportionate amount of time studying each domain. Combining this with the results of the practice exam, I put together a spreadsheet that calculated a relevance score for each domain based on how well/badly I did. Formula: (100 - my domain score in %) * domain coverage in % / 100 - see the worked example after this list. The final 2 weeks I spent focusing my studies on the 3 domains with the highest relevance scores
  • I think it is fair to say that with real previous AWS and general Solution Architecture experience you certainly stand the best chance of passing the exam - or indeed of doing well in it. I can't say how much experience is required as that depends on many factors, but I would certainly say a minimum of a couple of years in total.
  • Much of the AWS documentation and even some of the white papers now come in Kindle form. This is most excellent because it means that you can highlight sections for later review! And if a white paper or document is only available as a PDF, why not send it to your Kindle conversion and delivery email address?
  • Some of the practice exam questions are buggy and vague; I must admit I wasn't impressed with the quality. I suspect that one would generally achieve a higher score in the real exam
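
As a worked example of the relevance-score formula (the numbers here are hypothetical): suppose you scored 55% on a domain in the practice exam, and that domain makes up 20% of the real exam:

# relevance = (100 - domain score) * domain coverage / 100
# (100 - 55) * 20 / 100 = 9
echo $(( (100 - 55) * 20 / 100 ))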

P.S.: I took approximately the same approach when studying for the AWS Certified DevOps Engineer Professional exam, and it worked for me then as it did now.

12 March 2015

Getting PPTP VPN working in Ubuntu 14.04 with UFW enabled

Ok, a bit of a headscratcher. First of all, I didn't realise that UFW was blocking traffic on port 1723, which caused all VPN connection attempts to persistently fail - the proof was to be found in /var/log/syslog, and it was easily remedied:

sudo ufw allow 1723

However, this still didn't resolve the issue. And it wasn't until some searching that I came across an article that finally resolved the problem, by following its option 1 (adding "-A ufw-before-input -p 47 -j ACCEPT" to before.rules).
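
For reference, a minimal sketch of that change - IP protocol 47 is GRE, which PPTP uses for the tunnelled traffic, so it must be accepted in addition to TCP port 1723. The rule goes into /etc/ufw/before.rules (before the COMMIT line of the *filter section), after which UFW needs a reload:

# In /etc/ufw/before.rules: allow GRE (IP protocol 47) for PPTP
-A ufw-before-input -p 47 -j ACCEPT

# Apply the change
sudo ufw reload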


15 December 2014

Restoring Unity under Ubuntu 14.04

A few months back I installed the latest 14.04 Ubuntu updates and suddenly Unity stopped working - no launcher, no menu bars, no nothing...

I did a fair bit of searching the Interwebs and didn't find that one trick to restore Unity. So I gave up and installed Gnome instead, which ran fine.

However, the problem kept nagging me until tonight, when I finally took the plunge to figure out how to resolve it.

It turns out that for whatever reason there was no /etc/X11/xorg.conf (some posts suggest that it all automagically reconfigures itself - that didn't work so well for me). I therefore decided to try and create an xorg.conf.

An Xorg wiki page shows that if you run the following as root, it will generate a new xorg.conf: Xorg :0 -configure

One caveat: beforehand you will need to restart your PC and boot into recovery mode, so that X is not running when you attempt the Xorg configuration; then choose to run a root shell.

Then you will need to remount / as read/write:

mount -o remount,rw /

Now run:

Xorg :0 -configure
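
Note that Xorg -configure does not install the file it generates; it writes xorg.conf.new into root's home directory. So, copy it into place before rebooting (a sketch, assuming the default output location):

cp /root/xorg.conf.new /etc/X11/xorg.conf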

Now reboot.


If it's working, great!

If you had previously installed Gnome to revive your desktop (like me) then you can remove it (and thereby also restore the unity-greeter, so that you're no longer looking at the Gnome login at every reboot) by running:

sudo apt-get remove gnome-session
sudo apt-get autoremove

23 January 2014

Removing the /etc/apache2 directory on Ubuntu - the revenge!

Ok, so you decide in your infinite wisdom to remove the /etc/apache2 directory altogether, after a failed Chef cookbook attempt to create some new configuration against Apache 2.4 on Ubuntu 13.10 - the cookbook turned out to only support Apache 2.2.

That took some serious searching on the web in order to get Ubuntu to reinstate all the configuration.

Thank you Jorge Castro & ajmitch!

sudo apt-get -o DPkg::Options::="--force-confmiss" --reinstall  install apache2
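
The same trick should work for other packages whose configuration files have gone missing; dpkg -S shows which package owns a given path (a quick sketch):

# Which package owns the deleted directory?
dpkg -S /etc/apache2
# Then reinstall the owning package with --force-confmiss as above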

Anyway, I also fixed up the apache2(.2) cookbook so that it can now run against 2.4. But I do disclaim all liability/warranty - it was simply a quick fix using new syntax and constructs where warranted...

https://github.com/citizenme/chef-repo/tree/master/cookbooks/apache2