Thursday, November 9, 2017

Bash: MySQL backup (file per DB), restore + users and privileges

Back up MySQL databases (one file per DB):

#!/bin/bash
USER="root"
# List all databases, stripping the table border and the header row
databases=$(mysql -u "$USER" -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)
for db in $databases; do
    # Skip system schemas and anything starting with an underscore
    if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]]; then
        echo "Dumping database: $db"
        mysqldump -u "$USER" --databases "$db" > "$(date +%Y%m%d).$db.sql"
    fi
done
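The skip-list test in the loop can be exercised without a MySQL server. A minimal sketch, using a made-up database list in place of the SHOW DATABASES output:

```shell
#!/bin/bash
# Mock SHOW DATABASES output; the names here are hypothetical.
databases="information_schema mysql app1 _tmp performance_schema app2"
kept=""
for db in $databases; do
    # Same filter as the backup script: drop system schemas and _* names
    if [[ "$db" != "information_schema" && "$db" != "performance_schema" && "$db" != "mysql" && "$db" != _* ]]; then
        kept="$kept $db"
    fi
done
echo "$kept"
```

Only app1 and app2 survive the filter; each would then get its own dated dump file.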

Restore to RDS:

#!/bin/bash
USER="root"
PASSWORD=""
HOST="db1-web-eu-west-1.rds.amazonaws.com"
FILES="/opt/dumps/*"
for f in $FILES
do
  echo "Processing $f file..."
  mysql -h "$HOST" -u "$USER" -p"$PASSWORD" < "$f"
done
Users and privileges:

  • some dark magic: extract every user's grants into a SQL file

mysql -u root -e "select concat('show grants for ','\'',user,'\'@\'',host,'\'') from mysql.user" > user_list_with_header.txt
sed '1d' user_list_with_header.txt > ./user.txt
while read user; do
  mysql -u root -e "$user" > user_grant.txt
  sed '1d' user_grant.txt >> user_privileges.txt
  echo "flush privileges" >> user_privileges.txt
done < user.txt
awk '{print $0";"}' user_privileges.txt > user_privileges_final.sql
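To see what the sed and awk steps actually produce, here is the same transformation run on a fabricated one-user grant file (the file contents and the grant line are invented for the demo):

```shell
#!/bin/bash
# Fake `show grants` output: a header row plus one grant line (both invented).
printf 'Grants for app@%%\nGRANT SELECT ON db.* TO app\n' > user_grant.txt
sed '1d' user_grant.txt > user_privileges.txt      # drop the header row
echo "flush privileges" >> user_privileges.txt     # one flush per user
awk '{print $0";"}' user_privileges.txt > user_privileges_final.sql
result=$(cat user_privileges_final.sql)
echo "$result"
rm -f user_grant.txt user_privileges.txt user_privileges_final.sql
```

The awk step appends the terminating semicolons, so the result is directly replayable with `mysql < user_privileges_final.sql`.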

  • delete everything related to root

sed -i.bak '/root/d' user_privileges_final.sql

  • cleanup 

rm user.txt user_list_with_header.txt user_grant.txt user_privileges.txt user_privileges_final.sql.bak

Tuesday, September 19, 2017

How to send bash script output via email using AWS SES


  • Create SES SMTP user

https://console.aws.amazon.com/iam/home?#/s=SESHome

We need the SMTP credentials:

SMTP Username:
AXXXXXXXXXXXXQ
SMTP Password:
AsFbCCCXXXXXXXXXXXCXXXXXXXXXXXqQ6

  • Validate email to send FROM
SES - Email Addresses - Verify a New Email Address
  • Create a custom policy in IAM to grant permission to send emails and attach it to a role. Assign the role to the instance.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ses:SendEmail",
                "ses:SendRawEmail"
            ],
            "Resource": "*"
        }
    ]
}
  • Install the mailx package (yum install mailx) and test it:
-v: verbose
-s: subject
-S: set an option
-S nss-config-dir: the NSS cert DB directory; can be located with find / -name "cert*.db" (/etc/pki/nssdb/ in my case)
-S from="EMAIL@VALIDATED.DOMAIN": see step 2

mailx -v -s "Test" -S smtp-use-starttls -S ssl-verify=ignore -S smtp-auth=login -S smtp=smtp://email-smtp.us-east-1.amazonaws.com:587 -S from="EMAIL@VALIDATED.DOMAIN" -S smtp-auth-user=AXXXXXXXXXXXXQ -S smtp-auth-password=AsFbCCCXXXXXXXXXXXCXXXXXXXXXXXqQ6 -S nss-config-dir=/etc/pki/nssdb/ USER@DOMAIN.RECIPIENT

  • Edit global mailx conf (/etc/mail.rc)  and add account: 

account ses {
        set smtp-use-starttls
        set ssl-verify=ignore
        set smtp-auth=login
        set smtp=smtp://email-smtp.us-east-1.amazonaws.com:587
        set from="EMAIL@VALIDATED.DOMAIN"
        set smtp-auth-user=AXXXXXXXXXXXXQ
        set smtp-auth-password=AsFbCCCXXXXXXXXXXXCXXXXXXXXXXXqQ6
        set nss-config-dir=/etc/pki/nssdb/
}

  • Test mailx global config
echo -e "Test Mail body text" | mailx -v -s "Test" -A ses USER@DOMAIN.RECIPIENT

  • Create script and add it to cron
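A hypothetical cron script along those lines; the recipient address, subject and report content are placeholders, and it only invokes mailx (with the "ses" account from /etc/mail.rc) when the binary is actually installed:

```shell
#!/bin/bash
# Hypothetical daily report script; RECIPIENT is a placeholder address.
RECIPIENT="USER@DOMAIN.RECIPIENT"
report=$(df -h 2>&1)   # example payload: disk usage
# Send through the "ses" account configured in /etc/mail.rc, if mailx exists
if command -v mailx >/dev/null 2>&1; then
    echo "$report" | mailx -s "Daily report $(date +%F)" -A ses "$RECIPIENT" 2>/dev/null || true
fi
```

A matching crontab entry could look like `0 6 * * * /bin/bash /scripts/daily_report.sh` (the path is an example).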

Linux: find and delete files periodically

Create a script that finds and deletes all files older than 30 minutes:

#!/bin/bash

TARGET_FOLDER="/var/somedir/Archive"
cd "$TARGET_FOLDER" || exit 1
# -mmin +30 matches files modified more than 30 minutes ago (-mtime counts days)
find . -type f -mmin +30 -exec rm -f {} \;


Add to cron to run the script every day at 04:00:

0 4 * * * /bin/bash /scripts/SCRIPT LOCATION
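Before pointing the find expression at real data, it can be checked safely in a scratch directory (GNU touch/find assumed; all paths are temporary):

```shell
#!/bin/bash
# Scratch-directory check of the "older than 30 minutes" filter.
tmp=$(mktemp -d)
touch -d '40 minutes ago' "$tmp/old.log"   # backdated file, should be deleted
touch "$tmp/new.log"                       # fresh file, should survive
find "$tmp" -type f -mmin +30 -exec rm -f {} \;
survivors=$(ls "$tmp")
echo "$survivors"
rm -rf "$tmp"
```

Only new.log survives, confirming the minute-based cutoff behaves as intended.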


Monday, September 11, 2017

How to access MSSQL DB from Amazon Linux and to do a failover

To access an MSSQL DB, unixODBC is needed.
The Amazon yum repo has unixODBC-2.2.14-14.el6.x86_64, but unixODBC-2.3.2 or later is required.
unixODBC-2.3.2 and mssql-tools-14 can be found in the official MS repo.


  • 1) Delete the existing unixODBC (the version is too old and mssql-tools can't use it)

sudo yum remove unixODBC*


  • 2) Download the repo list

sudo curl https://packages.microsoft.com/config/rhel/6/prod.repo -o /etc/yum.repos.d/mssql-release.repo


  • 3) Install packages 

sudo ACCEPT_EULA=Y yum install msodbcsql-13.0.1.0-1 mssql-tools-14.0.2.0-1


  • 4) Not mandatory, but may be needed by applications sensitive to the unixODBC version (zabbix-mysql, for example, uses unixODBC 2.2)

sudo yum install unixODBC-utf16-devel


  • 5) Create symlinks for convenient usage

ln -sfn /opt/mssql-tools/bin/sqlcmd-13.0.1.0 /usr/bin/sqlcmd
ln -sfn /opt/mssql-tools/bin/bcp-13.0.1.0 /usr/bin/bcp


  • 6) To connect to DB use:
sqlcmd -S IP.IP.IP.IP,1433 -U 'USERNAME' -P 'PASSWORD'
  • 7) To execute something:
SELECT * FROM sys.databases
GO
  • 8) To run SQL in script (to do a failover in that case) :
/usr/bin/sqlcmd -S IP.IP.IP.IP,1433 -U 'USERNAME' -P 'PASSWORD' -Q "ALTER DATABASE ffmain SET PARTNER FAILOVER"

Thursday, August 10, 2017

GCP exam questions, as far as I remember

  • General:
    Total questions : 50
    Two case studies : 20 questions
    CI/CD - 1
    Storage/Database - 15
    General google services - 3
    Appengine vs GKE vs GCE usage -
    Preemptible instances / AutoScaling  / Global network
    Monitoring / stackdriver  - 5

    Exam subjects:
    CI/CD:
    • The correct format of the template for cloud deployment - yaml (+), jinja (+)
    Storage:
    • Difference between storage tiers
    • Quite a few questions asking what is the best storage/database for customer needs
    • VM disks: differences, which is faster, which is more reliable

    GCE:
    • AutoScaling triggers
    • Connection between 2 GCP projects: sharing resources, VPN connections, VPC
    • Instance tags for firewall rules
    • Load-balancer question about the difference between HTTP(S) and TCP/UDP (one is regional and one is cross-regional)
    • Preemptible instances (VMs): usage
    • Company wants to run their J2EE application; where is it best to deploy?
    • GCE
    • GAE Standard / Flex
    • GKE
    • External IP addresses: reserving

    • Slow connection between on-prem and GCP: direct connect, multi-VPN, ...
    • Network: global, spans zones in different regions (regions contain zones)
    • The default firewall allows a lot; in a custom firewall everything is forbidden
    • Company X wants to move to the cloud and has X configurations on-prem. What is the best way to build their environment in the cloud?
    • Start with the smallest VMs and scale up
    • Start with the biggest VMs and scale down
    • Go to their on-prem site, look at their configuration and build something similar in the cloud using custom VM configs
    • Estimate the VMs (CPU, RAM…) they will need and build custom VMs according to the estimation
    • Tags

    Monitoring/Logging
    • Developers are not satisfied with the logging of their application. What would you suggest to them?
    • Company wants to monitor and debug. Which Stackdriver options will they need? You are then presented with some real and some made-up options, plus some that are real but not needed.

    DBs / BigData
    • Quite a few questions asking what is the best storage/database for customer needs
    • Company can't tolerate ANY downtime at all. How do you copy their database from on-prem to Google Cloud?
    • Cloud SQL - Spanner - BigQuery (understand the usage according to the size of the project, the company and the amount of data)
    • Cloud SQL doesn't work with private IP, only public (currently)
    • Developers know SQL but you need to operate a noSQL DB. What do you use?
    • If you see Spark and/or Hadoop, the answer is Dataproc
    • 1-2 questions about Dataflow
    • IoT - weather station data streaming - Pub/Sub or Dataflow
    • Coldline, Nearline: difference in prices, access frequency and cost efficiency


    Security:
    • IAM roles and permissions difference
    • Company has DEV and Prod env. Split them into different projects, not accounts
    • Can you create a GCP account for a user who has no Gmail or G Suite account?
    • G suite and GCP connection - IAM usage


    Other services / General cloud knowledge:
    • Cloud DNS - the correct name and what it is used for
    • Labels
    • People are complaining that the news feed they see has all news marked as read.
      * INSERT PYTHON CODE*
      What could cause it?




    GKE & APP ENGINE - Coursera , course #4

    GKE
    • Master node: one, automatically created by GKE
    • Full resource consumption: what to use to utilize 100% of resources (containers)

    APP ENGINE:
    • Flex and Standard environments in GAE: difference in usage
    • Ability to run containers (GAE Flex)
    • Opsless infrastructure: suggestions
    • "Need to run an app written in a specific version of PHP without rolling VMs" (GAE Flex)

Sunday, August 6, 2017

Execute a bash script on different hosts (IP list in a file) via ssh

for host in $(grep IP /home/host_list | grep -v '^#' | awk '{print $3}' | sort); do

    # Strip letters so the banner shows only the numeric IP
    HostnameIP=$(echo "$host" | tr -d '[a-z]')
    echo "##### $HostnameIP ######"
    sleep 4
    ssh -i sshkey ec2-user@"$host" < bash_script

done
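The host-parsing stage can be tried on a mock host list without touching SSH; the file contents below are invented for the demo:

```shell
#!/bin/bash
# Mock host_list: a comment line plus two entries (all values invented).
cat > /tmp/host_list.demo <<'EOF'
# IP inventory
web1 IP 10.0.0.3
web2 IP 10.0.0.2
EOF
# Same pipeline as the loop header: keep IP lines, drop comments, take column 3
hosts=$(grep IP /tmp/host_list.demo | grep -v '^#' | awk '{print $3}' | sort)
echo "$hosts"
rm -f /tmp/host_list.demo
```

The comment line contains "IP" too, so the `grep -v '^#'` step is what actually drops it before the IPs are sorted.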

Thursday, August 3, 2017

mongodb backup - bash

#!/bin/bash

DATESTAMP=$(date +%Y%m%d_%H%M)
LOG_FILE=/var/log/mongodb_backup/mongodb_backup.log
[[ ! -d /var/log/mongodb_backup ]] && mkdir /var/log/mongodb_backup
set -x
exec >${LOG_FILE} 2>&1

PID=$$
PID_FILE=/var/run/db_backup.pid
DB_HOST='DB_HOST_IP'   # placeholder: set to the real DB host or IP
DB_NAME='DB_NAME'
DUMP_PATH=/mongobck/tmp
DUMP_NAME=${DB_NAME}_${DATESTAMP}
S3_PATH='s3://mongodb-backups/'
LOGGER_TAG='mongodb_backup'

echo ""
echo "$(date): Starting MongoDB Backup"

# Exit if another instance is running (a subshell "(... && exit)" would only exit the subshell)
if [[ -f ${PID_FILE} ]]; then
    logger -sit ${LOGGER_TAG} "Instance of script already running. Exiting."
    exit 99
fi
echo ${PID} > ${PID_FILE}

# Perform dump of mongo db
logger -sit "${LOGGER_TAG}" "Starting MongoDB backup of ${DB_NAME}"

/usr/bin/mongodump -h ${DB_HOST} --out ${DUMP_PATH}

logger -sit "${LOGGER_TAG}" "MongoDB Dump return $?."

# Compress mongo db
cd /mongobck/
tar -czvf ${DUMP_NAME}.tgz ${DUMP_PATH} && rm -rf ${DUMP_PATH}/*

# Copy compressed mongo db dump to s3
aws s3 cp /mongobck/${DUMP_NAME}.tgz ${S3_PATH} && rm -rf /mongobck/${DUMP_NAME}.tgz
logger -sit "${LOGGER_TAG}" "Backup of MongoDB completed successfully."

echo "$(date): MongoDB Backup Complete"
rm -f $PID_FILE
exit 0
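The PID-file guard is the one subtle part of the script; isolated, the pattern looks like this (a throwaway temp path stands in for /var/run, and the function names are made up for the demo):

```shell
#!/bin/bash
# Isolated PID-file lock demo; uses a temp path, not /var/run.
PID_FILE=$(mktemp -u)
acquire_lock() {
    [[ -f "$PID_FILE" ]] && return 99   # another instance holds the lock
    echo $$ > "$PID_FILE"               # otherwise claim it with our PID
}
first=0;  acquire_lock || first=$?      # expect 0: lock acquired
second=0; acquire_lock || second=$?     # expect 99: already locked
rm -f "$PID_FILE"
echo "$first $second"
```

In the real script the lock file must also be removed on every exit path (including failures), or the next run will refuse to start.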
