List octal file permissions in bash
A quick tip on how to list files with their octal permissions in bash using the stat command.
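The exact command is not shown in the teaser; a minimal sketch of the idea using GNU coreutils stat (format string as documented in its man page):

```bash
# print octal permissions followed by the file name for every entry
# in the current directory
stat -c '%a %n' *
```

For a single file, `stat -c '%a' /etc/passwd` prints e.g. `644`.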
I’m playing a lot with Docker lately. Building images, then rebuilding, then building again… It’s pretty boring. To automate this task a little I used inotify to rebuild automatically after I changed any file. This trick can be used in many different situations. On Linux you will need the inotify-tools package:

```bash
# Install inotify-tools on Linux
sudo apt-get install -y inotify-tools
```

Then run something like this: ...
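The loop itself is cut off above; a minimal sketch of the idea, with a placeholder image tag and the current directory as the watched path:

```bash
# rebuild the Docker image whenever any file in the current directory changes;
# "myimage" is a placeholder tag, adjust the watched path to taste
while inotifywait -r -e modify,create,delete,move .; do
    docker build -t myimage .
done
```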
A quick guide to installing and using Docker Compose for automating multi-container Docker applications, including command examples and a sample configuration file.
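The post's own configuration file is not reproduced here; a hypothetical minimal setup (service names, images and ports are made up for illustration):

```bash
# write a minimal docker-compose.yml with a web and a db service, then run it
cat > docker-compose.yml <<'EOF'
version: "2"
services:
  web:
    image: nginx:latest
    ports:
      - "8080:80"
  db:
    image: postgres:9.4
    environment:
      POSTGRES_PASSWORD: example
EOF

docker-compose up -d   # start both containers in the background
docker-compose ps      # check their status
docker-compose down    # stop and remove them
```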
For a few days now I have had access to UPC’s www.horizon.tv platform - until now it was useless on Linux. But there is Pipelight, which uses Wine to run Silverlight on Linux, and it works pretty well - you’re just a few commands away from achieving that:

```bash
# stop browser
killall firefox
# remove old version if you have it
sudo apt-get remove pipelight
```

Now configure repos and install packages: ...
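The repository setup is cut off above; on Ubuntu the route I recall was the project's PPA - treat the package and plugin names below as assumptions, not the post's exact commands:

```bash
# assumed Pipelight PPA route; names may differ from what the post used
sudo add-apt-repository ppa:pipelight/stable
sudo apt-get update
sudo apt-get install -y pipelight-multi
sudo pipelight-plugin --enable silverlight
```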
I just bought a new wifi card for my desktop computer. As the title says, it’s the Intel Dual Band Wireless-AC 7260 for Desktop. I was searching for a card that:

- supports the AC standard
- has 5GHz network support (2.4GHz channels are heavily cluttered in my neighborhood)
- has a PCI/PCIx or USB3 connector
- is Linux friendly (no module compilation by hand, support for aircrack-ng, kismet)

This is the only one I found that meets my expectations. ...
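A quick way to check that such a card was picked up by an in-kernel driver (generic PCI tooling, nothing from the post assumed):

```bash
# show the wireless adapter and which kernel driver/module claimed it
lspci -nnk | grep -A3 -i network
```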
There are many possible real-life cases and not every optimization technique will be suitable for you, but I hope this will be a good starting place. Also, you shouldn’t copy-paste the examples and just trust that they will make your server fly 😃 You have to back your decisions with extensive tests and the help of a monitoring system (e.g. Grafana).

Cache static and dynamic content

Setting a caching strategy for static and dynamic content may offload your server from the additional load of repetitive downloads of the same, rarely updated files. This will make your site load faster for frequent visitors. ...
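The concrete directives are cut off above; a minimal sketch assuming Nginx, with a made-up snippet path - the location block has to be included inside an existing server {} block:

```bash
# tell browsers to cache rarely-changing static assets for 30 days
sudo tee /etc/nginx/snippets/static-cache.conf > /dev/null <<'EOF'
# include this file inside a server {} block:
#   include snippets/static-cache.conf;
location ~* \.(css|js|png|jpe?g|gif|ico|svg|woff2?)$ {
    expires 30d;
    add_header Cache-Control "public";
}
EOF
sudo nginx -t && sudo service nginx reload
```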
Sometimes you need to make a quick and dirty image backup of a VM running on XenServer, and this post is about such a case 😃 List machines:

```
xl list
Name                      ID   Mem VCPUs      State   Time(s)
Domain-0                   0  4066     8     r-----  3526567.3
webfront1.example.com      1  4096     4     r-----  3186487.2
webfront2.example.com      2  2048     2     -b----   920408.2
```

Now you may export one:

```
xe vm-export vm=webfront1.example.com filename=/srv/backup/webfront.xva
Export succeeded
```

You may also use the uuid for that - list machines with xe vm-list (best with less) and then: ...
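The uuid variant is cut off above; a sketch with a placeholder uuid taken from the vm-list output:

```bash
# list VMs together with their uuids, then export by uuid instead of by name
xe vm-list | less
xe vm-export uuid=<uuid-from-vm-list> filename=/srv/backup/webfront.xva
```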
Sometimes a deployment process or other heavy task may cause some Nagios checks to go outside their normal levels and bother the admin. If this is expected and you want to add downtime on a host/service during this task, you may use this script:

```bash
#!/bin/bash
function die { echo "$1"; exit 1; }

if [[ $# -eq 0 ]] ; then
    die "Give hostname and time in minutes as parameter!"
fi
if [[ $# -eq 1 ]] ; then
    MINUTES=15
else
    MINUTES=$2
fi
HOST=$1
NAGURL=http://nagios.example.com/nagios/cgi-bin/cmd.cgi
USER=nagiosuser
PASS=nagiospassword
SERVICENAME=someservice
COMMENT="Deploying new code"
export MINUTES

echo "Scheduling downtime on $HOST for $MINUTES minutes..."
# The following is urlencoded already
STARTDATE=`date "+%d-%m-%Y %H:%M:%S"`
# This gives us the date/time X minutes from now
ENDDATE=`date "+%d-%m-%Y %H:%M:%S" -d "$MINUTES min"`
curl --silent --show-error \
    --data cmd_typ=56 \
    --data cmd_mod=2 \
    --data host=$HOST \
    --data-urlencode "service=$SERVICENAME" \
    --data-urlencode "com_data=$COMMENT" \
    --data trigger=0 \
    --data-urlencode "start_time=$STARTDATE" \
    --data-urlencode "end_time=$ENDDATE" \
    --data fixed=1 \
    --data hours=2 \
    --data minutes=0 \
    --data btnSubmit=Commit \
    --insecure \
    $NAGURL -u "$USER:$PASS" | grep -q "Your command request was successfully submitted to Nagios for processing." || die "Failed to contact nagios"
echo Scheduled downtime on nagios from $STARTDATE to $ENDDATE
```

Treat this script as a template with some tips: ...
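Usage is straightforward (the script file name here is made up): pass the host and optionally the downtime length in minutes.

```bash
# 30 minutes of downtime for webfront1.example.com
./nagios-downtime.sh webfront1.example.com 30
# or fall back to the default 15 minutes
./nagios-downtime.sh webfront1.example.com
```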
Now that you have CollectD and InfluxDB installed, you may configure Grafana 😃 First configure the repo with the current Grafana version (select your distro):

```bash
curl https://packagecloud.io/gpg.key | sudo apt-key add -
```

and add this line to your apt sources:

```
deb https://packagecloud.io/grafana/testing/debian/ wheezy main
```

Now install the package (on wheezy I needed to install apt-transport-https to allow installation of packages from a repo via HTTPS):

```bash
apt-get update
apt-get install -y apt-transport-https
apt-get install -y grafana
```

By default Grafana will use an sqlite database to keep information about users, etc: ...
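After installation, starting the service and poking the default port is a quick sanity check (service name and port 3000 are the package defaults as far as I know, not something stated in the post):

```bash
sudo service grafana-server start
# Grafana's web UI should answer on port 3000
curl -sI http://localhost:3000 | head -n1
```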
I wanted/needed some statistics on a few of my machines. I had seen Grafana earlier and was impressed, so that was the starting point. Then I started reading about graphite, carbon and whisper, and then… I found InfluxDB. The project is young but looks promising. Let’s start! The project page has no info about a repo, but one is available; configure it:

```bash
curl -sL https://repos.influxdata.com/influxdb.key | apt-key add -
echo "deb https://repos.influxdata.com/debian wheezy stable" > /etc/apt/sources.list.d/influxdb.list
```

for Ubuntu use a url like (of course selecting your version): ...
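The next steps are cut off above; a likely continuation (a sketch, not the post's exact commands) is installing the package and running a quick smoke test with the bundled CLI:

```bash
apt-get update
apt-get install -y influxdb
service influxdb start
# the influx CLI ships with the package; list the databases it knows about
influx -execute 'SHOW DATABASES'
```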