Category Archives: How-Tuesday

How to send email from the command line with SendGrid

Sending email should be simple. It's something I really don't like thinking about.

I ran into a situation where I wanted to send email from the command line, inside a bash script that helps power Review Signal: a notification when something goes wrong.

The only problem: I didn't have any mail program installed. I also didn't really want to send email directly from my servers, because making sure messages actually get delivered isn't something I want to spend a lot of time thinking about.

The first solution that came to mind was SendGrid, whose API I had the pleasure of trying out at PayPal BattleHack DC. It was dead simple. I had only integrated it with PHP, though; I didn't know if it would work from the command line.

I checked the docs and found they had a REST API, so a single curl call does it (note that values containing spaces or & should be URL-encoded):

curl -d 'to=destination@example.com&toname=Destination&subject=Example Subject&text=testingtextbody&from=info@domain.com&api_user=your_sendgrid_username&api_key=your_sendgrid_password' https://api.sendgrid.com/api/mail.send.json

If you want to clean it up with variables:

#!/bin/sh
SGTO=receiver@example.com
SGTONAME='Some Name'
SGSUBJECT='Email Subject'
SGFROM=from@example.com
SGTEXT='Email Text'
SGUSER=user
SGPASS=password
# --data-urlencode safely encodes spaces and special characters in each field
curl --data-urlencode "to=${SGTO}" \
     --data-urlencode "toname=${SGTONAME}" \
     --data-urlencode "subject=${SGSUBJECT}" \
     --data-urlencode "text=${SGTEXT}" \
     --data-urlencode "from=${SGFROM}" \
     --data-urlencode "api_user=${SGUSER}" \
     --data-urlencode "api_key=${SGPASS}" \
     https://api.sendgrid.com/api/mail.send.json

Voila! Sending emails from my bash script is now simple.
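Since the point of all this was a failure notification inside a larger script, the curl call can be wrapped in a small helper. This is a sketch: notify() reuses the SendGrid call from above, run_or_notify is a name I made up, and the SG* variables are assumed to be set as in the script above.

```shell
#!/bin/sh
# Hypothetical helper: send an email via SendGrid.
# Assumes SGTO, SGTONAME, SGFROM, SGUSER and SGPASS are already set.
notify() {
    curl -s --data-urlencode "to=${SGTO}" \
         --data-urlencode "toname=${SGTONAME}" \
         --data-urlencode "subject=$1" \
         --data-urlencode "text=$2" \
         --data-urlencode "from=${SGFROM}" \
         --data-urlencode "api_user=${SGUSER}" \
         --data-urlencode "api_key=${SGPASS}" \
         https://api.sendgrid.com/api/mail.send.json
}

# Run a command; email only if it exits non-zero.
run_or_notify() {
    "$@" || notify "Command failed: $*" "Non-zero exit on $(hostname)"
}

# Example usage (path is a placeholder):
# run_or_notify /usr/local/bin/nightly-job.sh
```

With this, each critical command in the script gets wrapped in run_or_notify and failures land in my inbox instead of a log nobody reads.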

How To Remove Files from Git Repo Permanently

I ran into this issue in the past week. My pack file (.pack) was enormous (many, many gigabytes) because I had committed a lot of files I shouldn't have in the past: large binary files, backups and other things that don't belong in a Git repo.

I deleted the files and committed the deletions, but Git is really good at making sure you never lose data. So despite the working tree being small, I had a pack file 10 times its size.

I googled around a lot and the best post I could find on StackOverflow was "Completely remove files from git repo and github on remote". It didn't quite work for me, though.

git filter-branch -f --index-filter 'git rm -r --cached --ignore-unmatch "filename"' --prune-empty HEAD
rm -rf .git/refs/original/ && git reflog expire --expire=now --all && git gc --aggressive --prune=now

I just had to add a few flags to get this working: -f and --prune-empty on the git filter-branch command, and --expire=now and --prune=now on the second line of commands.

Now you can run those two commands (replacing filename with a file or directory name) and they should permanently remove the file from the repo, including from the pack file.

Be very careful when you run these commands; I am not sure there is a way to undo the change. If the repo has a remote, you will also need to force-push the rewritten history for the change to propagate.

How To Configure MySQL InnoDB to Save Tables in Unique Files

MySQL is a great database, and InnoDB is one of the storage engines you can use; the other popular choice, and the default, is MyISAM (there are good posts explaining the differences). If you work with large amounts of data, you can sometimes run into weird problems you didn't know about or expect.

InnoDB saves all the tables in your database into one file, ibdata1 (plus the ib_logfile files). That's all well and good, except it doesn't free space when you drop a table, so ibdata1 never shrinks. This only becomes a problem with large amounts of data and heavy use of temporary tables. I don't know why this is the default behavior, but there is a very simple way around it: use a separate file for each table.

Instructions

Make sure you back up everything before doing this, and I don't recommend doing it on a live system.

1. Open up your my.cnf (the MySQL configuration file)

nano /etc/mysql/my.cnf

If you don't know where it is try searching for it:

find / -name my.cnf

2. Add the following under the [mysqld] section of the config:

innodb_file_per_table

3. Dump your existing InnoDB databases with mysqldump (see the next post), then remove the old files: ibdata1 and the ib_logfile files. Removing ibdata1 destroys the data in your existing InnoDB tables, which is why the dump comes first; you will re-import it after the restart.

4. Restart MySQL (a couple of examples below)

service mysqld restart
/etc/init.d/mysqld restart

MySQL will now save each InnoDB table in its own file and delete that file when you drop the table. Re-import your dump and each table gets its own file.
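Once MySQL is back up, you can confirm the setting took effect from the mysql client; this is a quick diagnostic, and the expected value assumes the option was picked up from my.cnf.

```sql
-- Run from the mysql client after the restart.
SHOW VARIABLES LIKE 'innodb_file_per_table';
-- Expected Value: ON. From now on, each new InnoDB table gets its own
-- .ibd file inside the database's directory under the MySQL data dir.
```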

How to Backup/Export Your MySQL Database(s)

I will teach you two ways to back up/export your MySQL database(s). The first option uses phpMyAdmin, which is generally installed on most shared hosting and is bundled with cPanel. The second uses SSH, which requires shell access (often found on VPS/dedicated servers).

Backing Up Your Database With phpMyAdmin

  1. Log in to phpMyAdmin. If you are using cPanel, it has its own icon on the dashboard.
  2. Click on the database you want to backup/export.
  3. You should see the database's list of tables. (screenshot: phpMyAdmin database view)
  4. Click on the Export tab at the top.

The default options are generally good. It will save both your schema (database design) and data (database content). However, if the database is large, you probably want to choose 'Save as a File' and pick a compression type (zip/gzip). This will let you download the file and have it take less space. If it's small and you just want the code directly, don't change this.

If you want to import the data, just click the import tab (instead of export) and upload the file.

Backing Up Your Database With SSH

We are going to use mysqldump, which comes with MySQL.

  1.  Login to your server via SSH.
  2. Type in the following command, replacing YOURUSERNAME, YOURPASSWORD and YOURDATABASE with your MySQL username, MySQL password and the database you want to back up. Also change /path/to/output/file.sql to the location where you want to save the backup.
mysqldump --quick --user=YOURUSERNAME --password=YOURPASSWORD YOURDATABASE > /path/to/output/file.sql

If you want to back up all your databases instead of a specific one, replace YOURDATABASE with --all-databases

Once the command finishes running, your backup is in the .sql file you specified.

If you want to automate backups of your database, there is a nifty MySql backup to Amazon's S3 file storage script here: https://github.com/woxxy/MySQL-backup-to-Amazon-S3
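As a simpler starting point for automation, here is a minimal sketch of a cron-friendly backup: a compressed dump with a dated filename. YOURUSERNAME, YOURPASSWORD, YOURDATABASE and the /var/backups path are placeholders, and backup_db is a name I made up.

```shell
#!/bin/sh
# Hypothetical nightly backup: compressed mysqldump with the date in the name.
backup_db() {
    STAMP=$(date +%Y-%m-%d)
    mysqldump --quick --user=YOURUSERNAME --password=YOURPASSWORD \
        YOURDATABASE | gzip > "/var/backups/YOURDATABASE-${STAMP}.sql.gz"
}

# Schedule it from cron, e.g. nightly at 3am (path is a placeholder):
#   0 3 * * * /path/to/backup-script.sh
# To restore a compressed dump:
#   gunzip < YOURDATABASE-2014-01-01.sql.gz | mysql --user=YOURUSERNAME --password=YOURPASSWORD YOURDATABASE
```

Piping straight into gzip keeps the dump from ever hitting disk uncompressed, which matters when the database is large.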

