
My review of the AWS Certified Developer – Associate exam

I took the exam last Friday and, luckily, passed with a score of 919/1000.

My background

I have been working in cloud environments for several years already, with AWS, OpenStack, Pivotal, and other cloud providers such as DigitalOcean. I also hold the Jenkins Enterprise certificate and have implemented many cloud-native approaches through Jenkins.

I was already familiar with the concepts behind many AWS services through my past work experience and other cloud providers' services. So I thought it would be easy for me to prepare for the exam… a big mistake.

How I prepared for the exam

Initially I started with Cloud Academy and took a practice exam to see where I stood in the AWS world, without even studying the materials first. As I expected, I failed with a score below 70%. The good thing about those practice tests is that they show you your weaknesses.

So once I knew where to focus first, I started reading materials in those areas to get a clear picture of how things work in AWS. One of the main areas I studied hard was security (KMS, server-side encryption, how SSE is implemented in various services, etc.), along with the unique features each service provides.

Once I had a better understanding, I would take the practice test again to see if I had retained the knowledge, and I repeated the same process over and over until I got it all.

After some time with Cloud Academy, I switched to Udemy's AWS courses. After the switch, I felt like I was starting fresh again. For some reason, the Udemy material seemed to have more recent information. Maybe I overlooked the Cloud Academy materials? lol. Now I am glad that I switched, since it made me read the official AWS documentation carefully for up-to-date information.

It took me about one month to prepare for the exam, mainly because I had previously been exposed to some AWS services, as well as cloud best practices I had implemented in the past.

Exam Experience

I took the exam online via Pearson VUE, since the physical test centers were all closed due to COVID-19.

The check-in process involved a few steps, which had me take 4 photos of my room (all 4 sides) and 2 photos of my ID (a CA driver's license). That happened after the VUE tool checked my computer system.

I waited in the queue for my turn. When it was my turn, I was initially connected, but my screen was stuck on a black background with only the VUE app's toolbar (which had a chat icon and a help button?). I waited 15–20 minutes and nothing happened, which convinced me that the software had glitched and I had to restart the whole process. By the way, the app would not let me quit with Command + Option + Esc on my Mac. I tried other tricks, but nothing worked. I rebooted my MacBook Pro…

After the laptop rebooted to the normal screen, I went back to the VUE website and looked for my exam. Shockingly, it would not let me click on it, saying either that 1. my exam had already started, or 2. I had missed my exam. I went to the help section of the site and could not find a link for immediate support. I thought I had wasted my money… And then I realized that I had downloaded an application for the test. So I looked for the application in my Downloads directory. It was my last hope. I double-clicked the app. It launched successfully, recognized my earlier check-in (so I did not have to upload the photos again), and put me in the waiting room right away. After a few minutes of waiting, I was connected to a real person, who mentioned that someone had already started my exam but would start it again for me. As soon as the exam started, I checked the remaining time: a full 2 hours and 9 minutes. Although the exam started 30 minutes late, I got my full exam time.

Was it worth taking the exam?

I believe it was. At the very least, I learned a LOT about the features that the AWS services covered by the exam offer to customers, and I solidified my foundational AWS knowledge: security, core services like EC2, and the AWS serverless product offerings.

As for my next certification goals, I am planning to take at least 2 or 3 more AWS exams: SysOps, DevOps Pro, and Solutions Architect Pro or Security Specialty. Also, one of the Certified Developer benefits is a 50% voucher for the next certification exam. So why not, right? 🙂

Mounting a Host Directory in Guest OS via open-vm-tools

I use VMware Fusion 11.5.1 to run an Ubuntu 18 server for many purposes. One of VMware Fusion's features is the ability to share a host directory with the guest OS, which uses the open-vm-tools utility.

After a successful OS installation, the tool is already installed and its service is up and running. However, the shared host directory is nowhere to be found at /mnt/hgfs.

To enable it, I had to run these commands:

$ sudo mount -t fuse.vmhgfs-fuse .host:/ /mnt/hgfs -o allow_other

## To make the mount persistent across reboots,
## append this line to /etc/fstab:
.host:/ /mnt/hgfs fuse.vmhgfs-fuse allow_other 0 0

After that, the shared directory appears in /mnt/hgfs location.

Bulletproof WordPress via Nginx

I assume many developers consider WordPress a joke since it's written in "PHP". However, WordPress still powers a lot of websites, so it is often inevitable to do some work on a project that deals with WordPress.

Personally, I've had to deal with many WordPress sites and resolve their security issues. The most common issues I observed were:

  • Backdoor attacks that use the infected host to perform various other types of attacks
  • Stealing an admin cookie
  • Using the stolen cookie to publish many dangerous posts
  • Using the stolen cookie to upload other scripts into the wp-content directory
  • And so on

The most commonly attacked paths, judging by the hackers and hacking tools I've seen, are /wp-admin/(post-new|post|admin-post).php and /wp-login.php.

Anyhow, the most impactful defense mechanism I found was to whitelist the IP addresses belonging to the admin users. So far, nothing else has beaten that solution, so I call it a bulletproof Nginx config for a WordPress site.

Here's the Nginx config I used for my clients to prevent hackers from intruding on a WordPress site.

  ## admin-ajax.php stays open to everyone; many front-end plugins depend on it.
  location ~ /wp-admin/admin-ajax\.php$ {
    try_files $uri =404;
    fastcgi_split_path_info ^(.+\.php)(/.+)$;
    fastcgi_read_timeout 300;
    fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;  ## adjust to your PHP-FPM version
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    include fastcgi_params;
    allow all;
  }

  ## Everything else under wp-admin, plus wp-login.php, is restricted to whitelisted IPs.
  location ~ (/wp-admin/.*\.php|wp-login\.php$) {
    try_files $uri =404;
    fastcgi_split_path_info ^(.+\.php)(/.+)$;
    fastcgi_read_timeout 300;
    fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;  ## adjust to your PHP-FPM version
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    include fastcgi_params;

    ## whitelist IPs (replace with your admin users' addresses)
    allow x.x.x.x;
    deny all;
    error_page 403 = @wp_ban;
  }

  ## Blocked requests are redirected back to the home page.
  location @wp_ban {
    rewrite ^(.*) https://mysite.com permanent;
  }

  ## Never execute PHP dropped into upload/content directories.
  location ~* /(?:uploads|files|wp-content|wp-includes|akismet)/.*\.php$ {
    deny all;
    access_log off;
    log_not_found off;
  }
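
Since the last location block hinges on a single regular expression (with the dot before "php" escaped so it matches literally), it is worth sanity-checking the pattern outside Nginx. JavaScript's regex engine is close enough to PCRE for this pattern, and the paths below are made-up examples:

```javascript
// The same pattern as the "uploads" location block, with the dot
// before "php" escaped so ".php" is matched literally.
const blocked = /\/(?:uploads|files|wp-content|wp-includes|akismet)\/.*\.php$/i

// Paths that must never reach PHP-FPM (made-up examples):
console.log( blocked.test('/wp-content/uploads/evil.php') )   // true
console.log( blocked.test('/wp-includes/shell.php') )         // true

// A legitimate static asset is unaffected:
console.log( blocked.test('/wp-content/uploads/logo.png') )   // false
```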

Sorting a Tree Object in JavaScript, Not Really

Today I had to output a tree-like object structure in JavaScript. Initially, I came up with this code:

// for spacing: two spaces per depth level
const tab = ( num ) => {
  let pad = ''
  for ( let i = 0; i < num; i++ ) {
    pad += '  '
  }
  return pad
}
// displaying
const display_directory = ( node, depth = 0 ) => {
  for ( let name in node ) {
    console.log(`${tab(depth)}${name}`)
    if ( Object.keys( node[ name ] ).length > 0 ) {
      display_directory( node[ name ], depth + 1 )
    }
  }
}

It works, but the order of the names is not guaranteed to be alphabetical, since object keys are not sorted. After a moment, I came up with a very simple fix to display the names in alphabetical order.

const display_directory = ( node, depth = 0 ) => {
  let keys = Object.keys(node).sort()
  for ( let key of keys ) {
    console.log(`${tab(depth)}${key}`)
    if ( Object.keys( node[ key ] ).length > 0 ) {
      display_directory( node[ key ], depth+1 )
    }
  }
}
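
As a quick check, here is the sorted version as a self-contained snippet, run against a small sample tree (the tree contents are made up for illustration; leaves are empty objects):

```javascript
// Two spaces of indentation per depth level.
const tab = ( num ) => '  '.repeat( num )

// Print keys depth-first, sorted alphabetically at each level.
const display_directory = ( node, depth = 0 ) => {
  let keys = Object.keys( node ).sort()
  for ( let key of keys ) {
    console.log(`${tab(depth)}${key}`)
    if ( Object.keys( node[ key ] ).length > 0 ) {
      display_directory( node[ key ], depth + 1 )
    }
  }
}

// Sample tree (made up for illustration).
const tree = {
  src: { utils: {}, app: {} },
  docs: {},
  readme: {}
}

display_directory( tree )
// docs
// readme
// src
//   app
//   utils
```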

Glossary for SRE

This is a list of things for SRE, mainly for myself. I easily forget terminology and find myself looking up terms even though I know what they mean… (aging, I guess?). So here's the list of terms I forget easily. (I'm not listing every term in the SRE world, since I already know most of them.)

I don’t know why but I forget these so easily… >.<

  • Deployment Strategies
    • Canary Deployment
    • Rolling Deployment
    • Blue-Green Deployment
  • MTTR – Mean Time To Repair
  • MTTF – Mean Time To Failure
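
The last two stick better with a toy calculation (the numbers below are made up); steady-state availability over a failure/repair cycle is MTTF / (MTTF + MTTR):

```javascript
// Time-to-failure samples (hours the system ran before each failure)
// and time-to-repair samples (hours each repair took). Numbers are made up.
const uptimeHours = [ 100, 150, 200 ]
const repairHours = [ 1, 2, 3 ]

const mean = xs => xs.reduce( ( a, b ) => a + b, 0 ) / xs.length

const mttf = mean( uptimeHours )   // Mean Time To Failure: 150
const mttr = mean( repairHours )   // Mean Time To Repair: 2

// Steady-state availability over a failure/repair cycle:
console.log( mttf / ( mttf + mttr ) )   // 150 / 152 ≈ 0.987
```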

Security alerts – GitHub

This is the first time I noticed Data Services on a GitHub repo, and I was impressed with its Security alerts service. It's free as of now, but I can kind of see where it may go… (cough, cough, Black Duck… or a similar paid service)

Nonetheless, since I turned on Security alerts on some of my personal repos, I have received many emails warning about security vulnerabilities that exist in those repos. There is also a feature where GitHub automatically submits a PR to fix the issue for you.

Code hint annoyance and how to disable it

As a new programmer, I could see code hints being quite useful. However, as a seasoned programmer, I find that the suggestions Visual Studio Code pops up as I write block my view of the code I am writing, and I have to dismiss them to see the rest of the code.

It is quite annoying, and I tried to find the setting to disable it without googling, and I lost. So I turned to Google and found this Stack Overflow thread.

Basically, you need to add or update these in the user settings:

"editor.parameterHints.enabled": false,
"editor.suggest.snippetsPreventQuickSuggestions": false,
"html.suggest.html5": false,
"editor.snippetSuggestions": "none",


Pact Broker setup with docker-compose

Almost all projects I work on are set up through docker-compose, by myself, and this Pact Broker setup was no exception. There are many advantages to using docker-compose, especially in a local environment.

Pact Broker itself deserves a separate post, so I am not going to cover it here at all…

These are the versions of docker and docker-compose, respectively:

$ docker version
Client:
 Version:           18.09.1
 API version:       1.39
 Go version:        go1.10.6
 Git commit:        4c52b90
 Built:             Wed Jan  9 19:35:23 2019
 OS/Arch:           linux/amd64
 Experimental:      false


Server: Docker Engine - Community
 Engine:
  Version:          18.09.1
  API version:      1.39 (minimum version 1.12)
  Go version:       go1.10.6
  Git commit:       4c52b90
  Built:            Wed Jan  9 19:02:44 2019
  OS/Arch:          linux/amd64
  Experimental:     false


$ docker-compose version
docker-compose version 1.17.0, build ac53b73
docker-py version: 2.5.1
CPython version: 2.7.13
OpenSSL version: OpenSSL 1.0.1t  3 May 2016

my pact broker project’s directory structure

docker-compose.yml content

version: '3.2'
services:
  postgresdb:
    build:
      context: ./db
      dockerfile: Dockerfile   ## path is relative to the build context (./db)
    expose:
      - "5432"
    volumes:
      - ./db/data:/var/lib/postgresql/data
    environment:
      - PACTBROKER_USER_PASSWORD
      - POSTGRES_PASSWORD
      - POSTGRES_USER=admin
      - PGDATA=/var/lib/postgresql/data/pgdata
    restart: 'always'
  pactbroker:
    image: dius/pact-broker
    links:
      - "postgresdb:postgresdb"
    environment:
      - "PACT_BROKER_DATABASE_PASSWORD=${PACTBROKER_USER_PASSWORD}"
      - "PACT_BROKER_DATABASE_USERNAME=pactbrokeruser"
      - "PACT_BROKER_DATABASE_HOST=postgresdb"
      - "PACT_BROKER_DATABASE_NAME=pactbroker"
    ports:
      - "8081:80"
    restart: 'always'

One thing to note here is that I've hardcoded the database name, database host, and Pact Broker's database username, since this is a local environment for development purposes. In a production-like environment, all of this information should probably be pulled from the environment rather than hardcoded.

So in a local environment, the only values you need to set are PACTBROKER_USER_PASSWORD and POSTGRES_PASSWORD before running docker-compose up or docker-compose up -d.

Dockerfile file content

$ cat Dockerfile
## Base image
FROM postgres

## Of course, every Dockerfile requires a maintainer
MAINTAINER me

## Add a special script to initialize db for pact broker
ADD ./initial.sh /docker-entrypoint-initdb.d/

The Dockerfile itself is pretty simple, as shown above: the only work it has to set up is creating a new database and user and granting the user full permissions on that database for Pact Broker, and the entrypoint script below deals with that.

initial.sh file content

#!/bin/bash
set -e

psql -v ON_ERROR_STOP=1 --username admin <<-EOSQL
  CREATE USER pactbrokeruser WITH PASSWORD '$PACTBROKER_USER_PASSWORD';
  CREATE DATABASE pactbroker WITH OWNER pactbrokeruser;
  GRANT ALL PRIVILEGES ON DATABASE pactbroker TO pactbrokeruser;
EOSQL

That is it! The script above is executed once, when a new container is created via docker-compose, and it creates a new user, creates a new database, and grants the user all privileges on that database.

Start Up!

Once those are in place, go back to the root of the project directory and execute this:

docker-compose up

The only reason I am not running it in the background is to watch for issues while the broker starts up. Once the log stays calm, fire up your browser (Chrome in my case), type 127.0.0.1:8081, and watch the Pact Broker come up.

The default port for the Pact Broker image is 80, but I had to map it to something else because I already had an HTTP server running on port 80 and needed to avoid the port conflict.

If you do not have anything running on port 80, you can change the ports section in the docker-compose file from "8081:80" to "80:80".
