Monitoring Linux and Windows hosts with Glances

Sysadmins have many tools for viewing and managing running processes, such as top, atop, and htop. All of these tools monitor CPU and memory usage, and most of them list information about running processes. Glances also monitors filesystem I/O, network I/O, and sensor readouts that can display CPU and other hardware temperatures and fan speeds, as well as disk usage by hardware device and logical volume.

To start Glances on a Linux host, open a terminal session and enter the command glances. Glances has three main sections—Summary, Process, and Alerts—as well as a sidebar. I'll explore each of them, along with other details of using Glances, below.

Summary section

The Summary section, at the top of the Glances display, provides an overview of the system's status. The first line shows the hostname, the Linux distribution, the kernel version, and the system uptime.

The next four lines display CPU, memory, swap, and load statistics. The left column shows the percentages of CPU, memory, and swap space in use; the CPU figure combines the statistics for all CPUs present in the system.
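Glances can also serve these statistics over a REST API when run in web server mode (glances -w). The following is a minimal sketch in Python, assuming a Glances 3.x server on the default port 61208; the endpoint paths and field names can differ between Glances versions.

import requests

# Query the Glances REST API for the same summary statistics.
# Assumes "glances -w" is running on this host (port 61208 by default).
BASE = "http://localhost:61208/api/3"

cpu = requests.get(BASE + "/cpu").json()
mem = requests.get(BASE + "/mem").json()
load = requests.get(BASE + "/load").json()

print("CPU used: %s%%" % cpu["total"])
print("Memory used: %s%%" % mem["percent"])
print("Load (1 min): %s" % load["min1"])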

Memory

The Memory portion of the Summary section contains statistics about memory usage.

Postgres - Getting started

Show all tables

select table_schema, table_name
from information_schema.tables
order by 1, 2;

Describe a table

\d+ tablename

Under the hood, the psql client turns this into a query similar to:

select column_name, data_type, character_maximum_length
from information_schema.columns
where table_name = '<name of table>';

To quit the psql client

\q
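The same catalog query works from application code. Here is a minimal sketch using the psycopg2 driver; the connection parameters and table name are placeholders.

import psycopg2

# Placeholder connection parameters; adjust for your server.
conn = psycopg2.connect(host="localhost", dbname="mydb",
                        user="postgres", password="secret")
with conn, conn.cursor() as cur:
    # Roughly the query shown above for describing a table.
    cur.execute(
        "select column_name, data_type, character_maximum_length "
        "from information_schema.columns "
        "where table_name = %s;",
        ("tablename",))
    for row in cur.fetchall():
        print(row)
conn.close()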

Using Apache Airflow to build reusable ETL on AWS Redshift

By Dorian Beganovic

Building a data pipeline on Apache Airflow to populate AWS Redshift

In this post we will introduce you to Apache Airflow, one of the most popular workflow management tools. Using Python as our programming language, we will build re-usable and parameterizable ETL processes on Airflow that ingest data from S3 into Redshift and perform an upsert from a source table into a target table. We will also show how to deploy and manage these processes with Airflow.
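To make the goal concrete, here is a minimal sketch of the kind of DAG this post builds toward. It is illustrative only: the connection id, bucket, IAM role, and table names are placeholders, and the import path assumes an Airflow 1.x-era installation.

from datetime import datetime
from airflow import DAG
from airflow.operators.postgres_operator import PostgresOperator

# Two steps: COPY from S3 into a staging table, then
# upsert (delete + insert) into the target table.
dag = DAG(
    dag_id="s3_to_redshift_upsert",
    start_date=datetime(2017, 1, 1),
    schedule_interval="@daily",
)

copy_from_s3 = PostgresOperator(
    task_id="copy_from_s3",
    postgres_conn_id="redshift",  # placeholder Airflow connection id
    sql="""
        COPY staging_table
        FROM 's3://my-bucket/data/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopy'
        FORMAT AS CSV;
    """,
    dag=dag,
)

upsert = PostgresOperator(
    task_id="upsert",
    postgres_conn_id="redshift",
    sql="""
        BEGIN;
        DELETE FROM target_table
        USING staging_table
        WHERE target_table.id = staging_table.id;
        INSERT INTO target_table SELECT * FROM staging_table;
        COMMIT;
    """,
    dag=dag,
)

copy_from_s3 >> upsert

The delete-then-insert transaction is the usual Redshift upsert pattern, since Redshift does not enforce unique constraints.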

Overview of Apache Airflow

Apache Airflow is an open-source workflow manager written in Python. It was originally created by Maxime Beauchemin at Airbnb in 2014, and the project joined the Apache Software Foundation's incubation program in March 2016. At the time of writing it has more than 350 contributors on GitHub and 4,300+ commits.

The main services Airflow provides are: a web-based UI for visualizing, monitoring, and triggering pipelines; a scheduler that decides when tasks run; an executor that dispatches tasks to workers; and a metadata database that records the state of every task run.

Troubleshoot Nginx: 10 typical errors

By Pedro Pessoa, Operations Engineer at Server Density.
Published on 7 July 2016.

1. Check the configuration for syntax errors or warnings:

$ sudo service nginx configtest
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful

If there are any issues, the output specifies the file and line number on which the error occurred:

$ sudo service nginx configtest
"worker_connections" directive is not allowed here in /etc/nginx/nginx.conf:12
nginx: configuration file /etc/nginx/nginx.conf test failed

Nginx also provides a -t switch to test the configuration files if the service command is not available on your system:

$ sudo nginx -t
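To automate this check, for example from a deployment script, you can shell out to nginx -t and inspect the exit code. A minimal sketch in Python, assuming nginx is on the PATH and the script has sufficient privileges:

import subprocess

# Run "nginx -t" and report whether the configuration parses cleanly.
# nginx writes its test output to stderr.
result = subprocess.run(["nginx", "-t"], capture_output=True, text=True)
if result.returncode == 0:
    print("configuration OK")
else:
    print("configuration test failed:")
    print(result.stderr)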

2. Is Nginx running?

Check the status of the Nginx service:

$ sudo service nginx status
* nginx is running
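The same check can be done programmatically. One option is the third-party psutil library, which lets you scan the process table for nginx master and worker processes; this is a sketch, not a replacement for the service commands shown here.

import psutil  # third-party: pip install psutil

# Scan the process table for nginx processes.
nginx_procs = [p for p in psutil.process_iter(["name"])
               if p.info["name"] == "nginx"]
if nginx_procs:
    print("nginx is running (%d processes)" % len(nginx_procs))
else:
    print("nginx is not running")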

Check the status of the Nginx systemd service:

$ sudo systemctl status nginx