All posts by erickclasen

About erickclasen

This is my Blog for writing about technical items and my other thoughts. I also use it to experiment with WordPress.

Kombucha with SCOBY

Kombucha SCOBY Timelapse Video

I ran a webcam pointed at a Kombucha fermentation, capturing frames with fswebcam running on a Raspberry Pi server. It ran for a few weeks.

You can see the Kombucha SCOBYs forming in the video. The setup is behind the PC monitor, with a desk lamp and the camera pointing towards the back of the desk. This gives fairly steady ambient lighting. I grew a dozen SCOBYs for the Summer Fermentation Class that was held on June 29th, 2015.
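For reference, the capture side is just a cron-driven fswebcam call; here is a minimal sketch, assuming fswebcam is installed, with made-up paths, a made-up frame-naming scheme, and an arbitrary 5 minute interval. The frames can later be stitched into an 8 frame/sec video, for example with ffmpeg.

# crontab entry: grab one 640x480 JPEG every 5 minutes (the % must be escaped in cron)
*/5 * * * * fswebcam -r 640x480 --jpeg 85 /home/pi/timelapse/frame-$(date +\%s).jpg
# Later, assemble the frames into the timelapse video, e.g.:
# ffmpeg -framerate 8 -pattern_type glob -i '/home/pi/timelapse/frame-*.jpg' timelapse.avi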

SCOBY Timelapse Video AVI Format 8 frames/sec

The video should be viewable most of the time unless I am servicing the Raspberry Pi or I pulled the plug on it during a nasty storm.

PHPList config.php Tweaks

PHPList Config File

PHPList works great for large email lists, but I found out that the configuration file can be tweaked to make certain aspects of it work even better.

The file to modify is config.php, located under the PHPList/config folder, which should be underneath the public_html folder on most installations.

Batch Processing Tweak

I have found that I frequently have to hit Process Queue on PHPList to get the entire email list to process. Recently I found that changing the following settings has made a big improvement in its performance. My service provider limits emails to 500 per hour, so I set MAILQUEUE_BATCH_SIZE to 480 and the batch period to 3600. Now it will send out the messages as fast as possible, up to the limit of 480. It seems to perform better if it can "get the messages out" fairly quickly. It sends out a bunch with the 2 second spacing that I have set (see below), and it usually gets them all out before whatever timer was expiring that used to make me hit Process Queue again.


## if you send the queue using commandline, you can set it to something that complies with the
## limits of your ISP, eg 300 messages an hour would be 
define("MAILQUEUE_BATCH_SIZE",480);
define("MAILQUEUE_BATCH_PERIOD",3600);

Make sure the browser-based MAILQUEUE_BATCH_SIZE and MAILQUEUE_BATCH_PERIOD defines shown below stay commented out, since they might conflict with what is done above.

## if you send the queue using your browser, you may want to consider settings like this
## which will send 10 messages and then reload the browser to send the next 10. However, this
## will not restrict the sending to any limits, so there's a good chance you will
## go over the limits of your ISP
#define("MAILQUEUE_BATCH_SIZE",10);
#define("MAILQUEUE_BATCH_PERIOD",1);

So that I don't overload the server by trying to send out messages too fast, I have set a slight delay…

# to avoid overloading the server that sends your email, you can add a little delay
# between messages that will spread the load of sending
# you will need to find a good value for your own server
# value is in seconds, and you can use fractions, eg "0.5" is half a second
# (or you can play with the autothrottle below)
define('MAILQUEUE_THROTTLE',2);

Queue Processing Reporting

It is also nice to see the results and know that the message actually went out. So I have it set to report this result…

# after every run of the queue to send out messages, phpList will send a summary to the
# admin address. If you want to stop this, set this to false or 0
define('SEND_QUEUE_PROCESSING_REPORT',true);


Send Only One Test Email

If you send a lot of test emails to yourself, it is a pain to get both a text and an HTML version every time, so you can restrict it to sending just one. Which one it sends is based on the way the user preferences are set up.

# test emails
# if you send a test email, phplist will by default send you two emails, one in HTML format
# and the other in Text format. If you set this to 1, you can override this behaviour
# and only have a test email sent to you that matches the user record of the user that the
# test emails are sent to
define('SEND_ONE_TESTMAIL',1);

Click Tracking

Click tracking is a method of tracking where email recipients go based on the links in the body of the email. You can track a decent amount of information on who goes to which links, and even the timing of it. I have found this feature very useful, as it can give an idea of how well your email is penetrating the market. To use this feature you have to include the [CLICKTRACK] directive in the message body or in a template.

# Click tracking
# If you set this to 1, all links in your emails will be converted to links that
# go via phplist. This will make sure that clicks are tracked. This is experimental and
# all your findings when using this feature should be reported to mantis
# for now it's off by default until we think it works correctly
define('CLICKTRACK',1);

# Click track, list detail
# if you enable this, you will get some extra statistics about unique users who have clicked the
# links in your messages, and the breakdown between clicks from text or html messages.
# However, this will slow down the process to view the statistics, so it is
# recommended to leave it off, but if you're very curious, you can enable it
define('CLICKTRACK_SHOWDETAIL',1);


Attachments

It is worth turning on the attachments feature in PHPList to be able to add attachments to the email. Turning it on creates a new tab in the PHPList menu while you are creating a message that allows you to add an attachment.

# attachments is a new feature and is currently still experimental
# set this to 1 if you want to try it
# caution, message may become very large. it is generally more
# acceptable to send a URL for download to users
# if you try it, it will be appreciated to give feedback to the
# users mailinglist, so we can learn whether it is working ok
# using attachments requires PHP 4.1.0 and up
define("ALLOW_ATTACHMENTS",1);

# if you use the above, how many would you want to add per message (max)
# You can leave this 1, even if you want to attach more files, because
# you will be able to add them sequentially
define("NUMATTACHMENTS",1);

# when using attachments you can upload them to the server
# if you want to use attachments from the local filesystem (server) set this to 1
# filesystem attachments are attached at real send time of the message, not at
# the time of creating the message
define("FILESYSTEM_ATTACHMENTS",1);

Bounce Processing

I have set purge and purge unprocessed to 1 so that the mailbox doesn't fill up; I want it to empty when the bounces are processed every month. I set the unsubscribe threshold to 15 instead of the normal 5. This gives the person a lot of chances to straighten out their email before it stops sending to them. In my opinion it is no big deal to let it send mail that will bounce back for a while, but eventually I want the mailing list cleaned of deadwood, even if it takes a year.

# set this to 0 if you want to keep your messages in the mailbox. this is potentially
# a problem, because bounces will be counted multiple times, so only do this if you are
# testing things.
$bounce_mailbox_purge = 1;

# set this to 0 if you want to keep unprocessed messages in the mailbox. Unprocessed
# messages are messages that could not be matched with a user in the system
# messages are still downloaded into PHPlist, so it is safe to delete them from
# the mailbox and view them in PHPlist
$bounce_mailbox_purge_unprocessed = 1;

# how many bounces in a row need to have occurred for a user to be marked unconfirmed
$bounce_unsubscribe_threshold = 15;

Wishlist

I would like to find a way to increase the timeout for PHPList when editing a message. The timeout I am referring to is the one that occurs if you don't save the draft message periodically and one hour goes by; you wind up losing your work, as PHPList makes you sign in again. So if you forget to save, or get interrupted and come back later, a lot of work can be lost accidentally. Even a feature like the one in WordPress, where a draft is saved periodically, would be a big help.

Tux Favicons

While I was setting up and testing one of my servers, I noticed that when I access it using Opera on a Kindle, Opera tries to grab a favicon.ico file, which was missing. I spotted this while browsing through the Apache error log on the server: it logs a missing file complaint about favicon.ico whenever I browse the site with Opera.

So I found some graphics of Tux the Penguin and the Raspberry Pi raspberry and made up some quick favicons for the web pages on the Raspberry Pi Server and my main server.

I have loaded them up here in large format, along with the favicons. All the Tux pics come from Linux 2.0 Penguins; see this page for the original art. I used GIMP to edit them, which worked well.
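If you want to skip the GUI, ImageMagick can also do the resize and ICO conversion from the command line; a minimal sketch, assuming ImageMagick is installed and using a hypothetical tux.png source image:

# Resize the source image and write ICO files at the two common favicon sizes
convert tux.png -resize 32x32 tux-favicon-32.ico
convert tux.png -resize 16x16 tux-favicon-16.ico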

Two 300 x 300 pixel examples of the Favicons

Tux Server Favicon
Tux Raspberry Pi Favicon


Favicon ICO Files

http://oils-of-life.com/blog/wp-content/uploads/2015/07/tux-raspberry.ico

http://oils-of-life.com/blog/wp-content/uploads/2015/07/tux-favicon-32.ico

http://oils-of-life.com/blog/wp-content/uploads/2015/07/tux-favicon-16.ico

http://oils-of-life.com/blog/wp-content/uploads/2015/07/tux-shine-server-big-S-32.ico

Stay relaxed and success will follow

The Bill Murray quote on the circuidipity.com site is a good one; it really makes you think. The site is great, with lots of useful content about Linux and the Raspberry Pi, plus OpenWRT router mods. I have learned a lot (and used it) by skimming through his posts. He also has some interesting quotes, like the Bill Murray one linked below.

Bill Murray on staying relaxed and how success will follow:

http://www.circuidipity.com/20140918.html


Automatic Server Status Page Creation Update

In January 2015 I created a post about automatically creating a status page for a Linux server that I have. Typically this is put under a restricted directory and allows you to see a snapshot of what is happening with the server. I run it by putting the scripts in the /etc/cron.hourly directory on a Linux PC and a Raspberry Pi running Linux.

It serves as a simple way to check up on the server without having to use a tool such as Webmin that requires a login. It also keeps a trail of log files that get rotated on a monthly basis, so there are always a few old ones around for tracking down problems and patterns in the operation.
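For anyone setting this up, dropping the script into the cron directory is all it takes; a sketch, assuming the hourly script below was saved locally as logcreate. Note that run-parts, which drives /etc/cron.hourly on Debian-based systems, skips filenames containing a dot, so don't name it logcreate.sh.

# Install the hourly status script (Debian/Ubuntu/Raspbian)
sudo cp logcreate /etc/cron.hourly/logcreate
sudo chmod +x /etc/cron.hourly/logcreate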

I have found this information useful for tracing down malfunctions that can occur when setting up a server, and also when I was trying to get a webcam up and running and had the USB bus hang a few times when the cam was overloaded with too much light.

In the new script file I fixed a bug by adding parentheses around a line that I was trying to echo, and I added code to run the w command, which shows a quick picture of who is logged in, how long the server has been up, and the 1, 5, and 15 minute load averages.

Logcreate Script

#!/bin/dash
# Remove old log
rm /var/www/status/log.txt
# Print logged outputs into new log.txt
date >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
tail /var/log/syslog >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
free >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
df -h >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
# Top memory using processes http://www.commandlinefu.com/commands/view/3/display-the-top-ten-running-processes-sorted-by-memory-usage
#ps aux | sort -nk +4 | tail >> log.txt
echo "USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND" >> /var/www/status/log.txt
ps aux | sort -nrk 4 | head >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
# Logged in User info using w command
w >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
echo >> /var/www/status/log.txt
# Copy log.txt into the full log that is collected
cat /var/www/status/log.txt >> /var/www/status/fulllog.txt
# Create a free standing copy of the process tree
pstree > /var/www/status/pstree.txt

Alternate Version

I also created a version of the script for a desktop Linux PC that does not have Apache installed. In it I use a DIR variable to hold the directory where I want the log.txt file stored.

#!/bin/dash

# User defined variables
# No trailing / on DIR!
DIR=/home/erick/status

# Remove old log
rm $DIR/log.txt
# Print logged outputs into new log.txt
date >> $DIR/log.txt
echo >> $DIR/log.txt
tail /var/log/syslog >> $DIR/log.txt
echo >> $DIR/log.txt
free >> $DIR/log.txt
echo >> $DIR/log.txt
df -h >> $DIR/log.txt
echo >> $DIR/log.txt
# Top memory using processes http://www.commandlinefu.com/commands/view/3/display-the-top-ten-running-processes-sorted-by-memory-usage
#ps aux | sort -nk +4 | tail >> log.txt
echo "USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND" >> $DIR/log.txt
ps aux | sort -nrk 4 | head >> $DIR/log.txt
echo >> $DIR/log.txt
# Logged in User info using w command
w >> $DIR/log.txt
echo >> $DIR/log.txt
echo >> $DIR/log.txt
# Copy log.txt into the full log that is collected
cat $DIR/log.txt >> $DIR/fulllog.txt
# Create a free standing copy of the process tree
pstree > $DIR/pstree.txt

Rotation of Log

In the /etc/cron.monthly directory I have created a file that is called status-log-rotate and it will save backup copies of 2 months worth of the full concatenated server status logs.

#! /bin/bash
DIR=/home/erick/status
mv $DIR/fulllog.txt.1 $DIR/fulllog.txt.2
mv $DIR/fulllog.txt $DIR/fulllog.txt.1

Tweaks for Raspberry Pi

The Raspberry Pi has an SD card that I am trying to avoid writing to too often, so I have recently made some modifications to put the /tmp folder into RAM using tmpfs. I create the hourly log underneath a folder there, and each hour it gets concatenated onto a daily log. Once per day, via a script in /etc/cron.daily, the daily log gets appended to the full log, which lives under a restricted-access status folder on the SD card. The end result: no multiple hourly writes to the log file, just one append to the full log per day. The only downside is that if the power drops, some log entries will be lost for that day.
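The tmpfs mount itself is a one line /etc/fstab entry; a minimal sketch, with the 50 MB size being an arbitrary example value to tune for your own Pi:

# /etc/fstab entry: keep /tmp in RAM to spare the SD card (size is an example value)
tmpfs   /tmp   tmpfs   defaults,noatime,size=50m   0   0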

Logcreate runs from /etc/cron.hourly for Raspberry Pi

#!/bin/dash
# Set DIR, on Pi this is a temp location for log
DIR=/tmp/web

# Set fixed DIR FIXDIR for files that have to be stored on SD card
# Nevermind, just make a daily log and then copy that to the full log daily.
#FIXDIR=/var/www/status

# Remove old log

rm $DIR/log.txt
# Print logged outputs into new log.txt
date >> $DIR/log.txt
echo >> $DIR/log.txt
tail /var/log/syslog >> $DIR/log.txt
echo >> $DIR/log.txt
free >> $DIR/log.txt
echo >> $DIR/log.txt
df -h >> $DIR/log.txt
echo >> $DIR/log.txt
# Top memory using processes http://www.commandlinefu.com/commands/view/3/display-the-top-ten-running-processes-sorted-by-memory-usage
echo "USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND" >> $DIR/log.txt

ps aux | sort -nrk 4 | head >> $DIR/log.txt
echo >> $DIR/log.txt
# Logged in User info using w command
w >> $DIR/log.txt
echo >> $DIR/log.txt
echo >> $DIR/log.txt
# Copy log.txt into the full log that is collected
cat $DIR/log.txt >> $DIR/dailylog.txt
# Create a free standing copy of the process tree
pstree > $DIR/pstree.txt

dailylog-to-fulllog script, runs from /etc/cron.daily

#! /bin/bash

DIR=/tmp/web
FIXDIR=/var/www/status

echo "----------------------------------------------" >> $DIR/dailylog.txt
date >> $DIR/dailylog.txt
echo "----------------------------------------------" >> $DIR/dailylog.txt
cat $DIR/dailylog.txt >> $FIXDIR/fulllog.txt
rm $DIR/dailylog.txt

Logcreate Output from Raspberry Pi

Below is what the logcreate script will output to the log.txt file on a Raspberry Pi that I have running as a web server.

Sun Jul 12 14:17:01 EDT 2015

Jul 12 13:47:51 raspberrypi dhclient: DHCPACK from 192.168.1.1
Jul 12 13:47:52 raspberrypi dhclient: bound to 192.168.1.17 -- renewal in 40673 seconds.
Jul 12 13:59:01 raspberrypi /USR/SBIN/CRON[28010]: (erick) CMD (aplay /opt/sonic-pi/etc/samples/guit_e_fifths.wav)
Jul 12 13:59:07 raspberrypi /USR/SBIN/CRON[28009]: (CRON) info (No MTA installed, discarding output)
Jul 12 14:00:01 raspberrypi /USR/SBIN/CRON[28013]: (erick) CMD (/home/erick/fswebcam/cron-timelapse.sh >> timelapse.log)
Jul 12 14:00:23 raspberrypi /USR/SBIN/CRON[28012]: (CRON) info (No MTA installed, discarding output)
Jul 12 14:01:01 raspberrypi /USR/SBIN/CRON[28022]: (root) CMD (/home/erick/bin/usbreset /dev/bus/usb/001/004)
Jul 12 14:01:02 raspberrypi /USR/SBIN/CRON[28021]: (CRON) info (No MTA installed, discarding output)
Jul 12 14:09:01 raspberrypi /USR/SBIN/CRON[28053]: (root) CMD (  [ -x /usr/lib/php5/maxlifetime ] && [ -x /usr/lib/php5/sessionclean ] && [ -d /var/lib/php5 ] && /usr/lib/php5/sessionclean /var/lib/php5 $(/usr/lib/php5/maxlifetime))
Jul 12 14:17:01 raspberrypi /USR/SBIN/CRON[28064]: (root) CMD (   cd / && run-parts --report /etc/cron.hourly)

             total       used       free     shared    buffers     cached
Mem:        445804     424488      21316          0     106768     260516
-/+ buffers/cache:      57204     388600
Swap:       102396          0     102396

Filesystem      Size  Used Avail Use% Mounted on
rootfs          6.3G  3.1G  3.0G  51% /
/dev/root       6.3G  3.1G  3.0G  51% /
devtmpfs        214M     0  214M   0% /dev
tmpfs            44M  240K   44M   1% /run
tmpfs           5.0M  8.0K  5.0M   1% /run/lock
tmpfs            88M     0   88M   0% /run/shm
/dev/mmcblk0p5   60M   19M   41M  32% /boot

USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root      2071  0.0  3.0  24896 13652 ?        Ss   Jun28   2:24 /usr/sbin/apache2 -k start
www-data 27745  0.0  1.5  25412  7084 ?        S    09:58   0:00 /usr/sbin/apache2 -k start
www-data 27744  0.0  1.5  24960  6760 ?        S    09:58   0:00 /usr/sbin/apache2 -k start
www-data 27743  0.0  1.5  25428  7116 ?        S    09:58   0:00 /usr/sbin/apache2 -k start
www-data 27742  0.0  1.5  25396  7036 ?        S    09:58   0:00 /usr/sbin/apache2 -k start
www-data 27538  0.0  1.5  25396  7032 ?        S    06:25   0:00 /usr/sbin/apache2 -k start
www-data 27502  0.0  1.5  25404  7036 ?        S    06:25   0:00 /usr/sbin/apache2 -k start
www-data 27501  0.0  1.5  25396  7044 ?        S    06:25   0:00 /usr/sbin/apache2 -k start
www-data 27747  0.0  1.3  24936  6188 ?        S    09:58   0:00 /usr/sbin/apache2 -k start
www-data 27746  0.0  1.3  24936  6188 ?        S    09:58   0:00 /usr/sbin/apache2 -k start

 14:17:02 up 14 days, 12:56,  1 user,  load average: 0.00, 0.01, 0.05
USER     TTY      FROM             LOGIN@   IDLE   JCPU   PCPU WHAT
erick    pts/0    192.168.1.5      14:04   10:39   1.70s  1.70s -bash

Bonsai Leaves

Bonsai Comes Back to Life

I had a Bonsai tree that was neglected at times over the winter when we were away from home. It was not getting enough water, dried out, and lost all of its leaves. I had a few other plants die off, but most of them are easily replaced from commonly obtainable seeds. This Bonsai, however, was a plant that I had kept for about 10 years, and it was disappointing to lose it.

Bonsai ready to be junked

In early May, I pulled it out of its pot and put it on a pile to be discarded. Luckily, the place where it was put was in the shade. Plus, this year (2015) we have received an exceptional amount of rain through June, more than double the average. All of this helped to keep it alive at some low level, even without soil.

It lives for 2 months, no soil, no care.

In mid-July, it started to get a few little green leaves on it. What a miracle, it had some life left in it after all! Now it is back in the pot and I will eventually see how many branches are alive.

Rescued Bonsai

This is the second plant that has come back from the dead this year; the Bonsai joins a Ginkgo Biloba tree that also revived.

Rescued and repotted!
Tiny new leaves on the Bonsai


Automating PHPList Process Queue

Frequently, when using the non command line version of PHPList, which I have loaded on my service provider's server, I have to hit Process Queue multiple times for a message. I have read a lot of posts online complaining about this issue. I have done some troubleshooting of it myself by turning on the verbose feature in PHPList, and I am still at a loss as to what it gets stuck on before it stops processing the queue.

PHPList works great, I love it for all that it does and does well. My one complaint is the Process Queue issue and from reading a bunch of posts online I have successfully patched together a solution.

This code can be executed on the server where PHPList lives if your provider will let you run a wget command from a CRON entry; some won't. In my case I opted to simply run the command from a Raspberry Pi server that I have running 24/7. It is simply a looping script that does the equivalent of hitting the "Process Queue" button on PHPList at regularly timed intervals, and it detects when the queue has been processed. I have seen several examples of using wget fired off every 5 minutes to do this, but that means it gets run every 5 minutes, all day, every day. Why not run something that can be scheduled for a specific time and then runs to completion, which is what this script does?
This script also produces a log file that can be copied to a location viewable via the server, so you can double check that everything ran OK. I see this mostly as a debug feature, and it could potentially be turned off along with some of the echos of text, except for the one in the else clause; it is an error to have an empty else clause in a bash script.

PHPList Automatic Process Queue Script

#!/bin/bash
# Controller Script for PHPlist Process Queue
# Erick Clasen blog.oils-of-life.com 05/16/2015

# User defined variables
# No trailing / on DIR!
DIR=/home/erick/php-proc
LOG=php-pq.log
# Time to sleep between loop iterations.
TIME=3m
# Max number of loops, so the script breaks out if for some reason it fails to complete.
LOOPS=100
# Publish a copy of the log to the following directory
PUBDIR=/home/erick/public_html

# wget having trouble with some options so simply switch to the dir we want it to work in.
cd $DIR

x=1
echo Start------------- >> $LOG

# While loop, breaks out if done or at LOOPS as a guard against an infinite loop.
while [ $x -le $LOOPS ]; do
    # Timestamp in DDMMYYYY_HHMM-SS (UTC)
    timestamp=$(date -u +"%d%m%Y_%H%M-%S")

    echo $x $timestamp
    echo $x $timestamp >> $LOG

    x=$(( $x + 1 ))

    # Use wget to run process queue with login and password provided
    wget 'your_url_here.com/phpList/admin/?page=processqueue&login=phplist&password=your_password_here'

    # Test to see if PHPlist process queue has completed.
    if grep -q "Finished, Nothing to do" $DIR/index.*
    then
        # Exit from loop with a break: mark the log, remove the index files that wget fetched, break out.
        echo All Done! Breaking Out
        echo All Done!--------- >> $LOG
        echo ------------------ >> $LOG
        # Publish Log, Optional
        cp $LOG $PUBDIR/$LOG
        rm $DIR/index.*
        break
    else
        # Nothing happens, keep looping.
        echo Keep Running...
    fi

    # Sleep for TIME
    echo ------- Sleeping Process --------
    sleep $TIME
done

PHPList Automated Process Bounces Script

Process Bounces can be run from a script as well…

#!/bin/bash
# Controller Script for PHPlist Process Bounces
# Erick Clasen blog.oils-of-life.com 05/16/2015

# User defined variables
# No trailing / on DIR!
DIR=/home/erick/php-proc
LOG=php-pq.log
# Publish a copy of the log to the following directory
PUBDIR=/home/erick/public_html

# wget having trouble with some options so simply switch to the dir we want it to work in.
cd $DIR

# Use wget to run process bounces with login and password provided
wget 'your_url_here.com/phpList/admin/?page=processbounces&login=phplist&password=your_password_here'

echo ---------------------------- >> $LOG
echo BOUNCES HAVE BEEN PROCESSED! >> $LOG
echo ---------------------------- >> $LOG
# Publish Log, Optional
cp $LOG $PUBDIR/$LOG
rm $DIR/index.*

CRON entry to run these scripts

These entries run Process Queue every Monday, Wednesday and Friday (1,3,5) at 4:15 AM and Process Bounces at 12:15 AM on the first day of the month. There is nothing magic about those settings; you can set them any way you wish. I figure starting the process in the wee hours of the morning will allow people to receive the email by the time they wake and first check their mail. I set the time to 15 after the hour so it won't run at the same time as some processes that I have set to run on the hour. Sending mail on weekdays keeps people from having to deal with more email than they might like on the weekends.

15 04 * * 1,3,5 /home/erick/php-proc/php-pq.sh
15 00 1 * * /home/erick/php-proc/php-pb.sh

PHPList Automatic Process Queue Override

If you must make PHPList process the queue now and don't want to wait for the CRON task, two options are available: do it the normal way from the PHPList admin website, or use something as simple as a CGI script to force the php-pq.sh script to execute. Below is an example of a dirt simple CGI script. It prints out the tail of php-pq.log, the log of the previous automatic processing event, and then fires off the Process Queue override by running the script. Basically, the act of loading the page makes it run. I keep my cgi-bin directory locked down so no one outside of a few IP addresses can get to the files there, but it would be fairly harmless if someone did; the worst that happens is that the queue gets processed.
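For reference, the kind of cgi-bin lockdown meant here can be done in the Apache config; a sketch for Apache 2.4, assuming the stock /usr/lib/cgi-bin path and with placeholder addresses standing in for the allowed IPs:

<Directory "/usr/lib/cgi-bin">
    # Only allow the listed addresses to reach the CGI scripts
    Require ip 127.0.0.1 192.168.1.0/24
</Directory>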

CGI Code for Automatic Process Queue Override

#!/bin/bash
# CGI script: shows the tail of the last php-pq.log and then runs php-pq.sh.
# The HTML tags below are a reasonable reconstruction; the originals were
# stripped out when the script was pasted into WordPress.
echo "Content-type: text/html"
echo ""
echo "<html><head><title>PHPList Automatic Process Queue Override</title></head><body>"
echo "<h2>PHPList Automatic Process Queue Override from host $(hostname -s &)</h2>"
echo "<pre>"
echo "$(tail /home/erick/php-proc/php-pq.log &)"
echo "</pre>"
echo "<p>Information generated on $(date &)</p>"
echo "<p>Processing...</p>"
echo "<pre>"
echo "$(/home/erick/php-proc/php-pq.sh &)"
echo "</pre>"
echo "</body></html>"


PHP Config File Tweaks

I am working on taking some notes on some changes I have made to the PHPList config.php file and will be posting on this in August 2015.

Ginkgo Biloba June 2015

Ginkgo Biloba Tree Comes Back From the Dead

Ginkgo Biloba Shoots
New Ginkgo Biloba Shoot

I have a Ginkgo Biloba tree that was planted around 2009 from a seedling; it was pulled out of the ground in fall 2013 while moving. It sat indoors over the winter, not a good idea for deciduous trees, as they need to go through the cold of winter to set their clocks for the Spring budding. It was watered off and on while it was dormant that Winter, but it mostly felt dried out. In the Spring the buds were dry and the branches turned pretty hard and dead feeling, snapping off easily. I planted it in the ground in late April 2014 and it sat there, dead looking, for the entire season.

I decided not to give up on it and pull it out, as I have heard that sometimes deciduous trees will come back the following year. The winter of 2014-2015 was particularly harsh, one of the coldest in 100 years, with temperatures reaching -28 F here. In the Spring, still nothing, but I waited and waited to see if anything would happen; if it didn't, I would plant a tree in its place in 2016. I was already considering where I might get another small Ginkgo Biloba tree.

2015 provided a hot May and a wet June, so the ground was good and warm early and had plenty of moisture. One day in mid-June I was mowing the lawn, cutting close to a lot of obstacles such as the tree, when I stopped dead in my tracks as I noticed what appeared to be weeds growing around the base of it. But something looked different about the "weeds" and I stopped just in time not to mow them down. Sure enough, it was actually new growth sprouting near the base of the Ginkgo tree.

For this year I will let the new growth run wild, leaving the dead stick part of the plant as a marker so I don't mow or step on the shoots coming up. Next year I will pick the best shoot and let it thrive, cutting off the dead growth and the runty shoots that may appear.

It is amazing how nature surprises you just when you least expect it. One of the oldest species of tree, the Ginkgo Biloba has a few tricks up its sleeve for survival after all.

Ginkgo Biloba
Ginkgo Biloba: June 2015

Hungarian Pancakes

Inspired by Diners, Drive-ins and Dives, episode aired February 24, 2014, between 11-12PM EDT.

I have made pancakes with potatoes and onions before, but watching this episode made me experiment with paprika, garlic, and Old Bay. Plus I generally like to put a little bit of curry and/or turmeric in too; it helps stabilize the potatoes, as they tend to oxidize quickly once chopped. I have noticed red potatoes are better in this regard; they take a long time to discolor.

4 medium size potatoes
1 small onion
2 cloves garlic
2 heaping tbsp white flour
1 egg, I used a large one
1 tsp turmeric or curry, more if you like
1 tsp baking powder
Oil or fat for the pan frying. I used coconut oil and bacon fat.

The following to taste
Salt
Paprika
Old Bay

1. Cut up the potatoes and onions, add in garlic cloves and run through a food processor on chop mode.
2. Dump into a bowl and add salt, mix.
3. In a second bowl add flour and other dry ingredients, mix thoroughly.
4. Start preheating pan, put the oil and fat in it. This is not deep frying, so coat the pan but coat generously.
5. Add egg to potato, onion and garlic bowl and mix thoroughly.
6. Add the premixed dry ingredients in slowly while mixing. Adding them slowly prevents lumps of flour from forming.
7. When the pan is hot enough, like when a drop of water will sizzle in it, drop in a serving spoon sized amount of the mixture and flatten it out a bit with a spatula.
8. After a few seconds, when it has set, shimmy the pan around so the oil can get under it and all around the edges.
9. When it is golden brown on the bottom, flip it over and get the other side golden brown. This requires a guess; the time may vary based on size. The first few might get flipped over a few times until you get the timing down.
10. When done, place on a plate with paper towels to absorb excess oil. I like to keep them in the oven on warm until I am done with the whole batch.

These pancakes are good as a side to many dishes and are a different way to use potatoes. The DDD episode shows them topped with goulash, which looked great. The first time I made them I used them as "buns" for a nice grass-fed beef hamburger (Wegmans has great grass-fed beef). It was great tasting and filling too.

potato-pancakes
Hungarian Pancakes

Hourly Chime for Linux and Mac

It is easy to set up a simple CRON job to play a sound at the top of the hour by running aplay on a Linux machine. Something like this would also work on a Mac with minor changes; afplay is the default command line player on a Mac, and CRON works the same. For Windows I have not tried it, but Task Scheduler set to run every hour, plus some easy command line player to fire off, should do it; that kind of tooling has been around since DOS.
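As a concrete example of the Mac variant, the crontab entry below is a sketch that uses one of the stock macOS alert sounds; any .aiff or .wav path will do:

00 09-23 * * * afplay /System/Library/Sounds/Glass.aiff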

aplay

aplay works with WAV files, so you can use oggdec to convert OGG files to WAV. A lot of sound theme files come in OGG or WAV format. aplay and mplayer come installed by default in Ubuntu, at least in 14.04 LTS, which I am running. If not, a simple…

sudo apt-get install alsa-utils    # aplay is part of the alsa-utils package

…or…

sudo apt-get install mplayer

…will get them installed.

oggdec

Oggdec is part of a very small install package; it takes seconds to install.

To install…

sudo apt-get install vorbis-tools

To convert an OGG audio file to a WAV audio file…

oggdec filename.ogg
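If you want to convert a whole theme's worth of files at once, a simple shell loop works; a sketch, run from inside the directory holding the OGG files:

for f in *.ogg; do oggdec "$f"; done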

Sound Themes

The sound themes are located at /usr/share/sounds. If you go there and try out the sounds, you might find one that sounds good to you for an hourly chime.

Below is a two level tree of /usr/share/sounds, made using the tree command. If you don't have tree, you can get it in a few seconds using…

sudo apt-get install tree

Output of the tree command showing 2 levels below /usr/share/sounds

(tree /usr/share/sounds -d -L 2)

/usr/share/sounds
├── alsa
│   ├── Front_Center.wav
│   ├── Front_Left.wav
│   ├── Front_Right.wav
│   ├── Noise.wav
│   ├── Rear_Center.wav
│   ├── Rear_Left.wav
│   ├── Rear_Right.wav
│   ├── Side_Left.wav
│   └── Side_Right.wav
├── fLight__2.0
│   ├── Copyright
│   ├── index.theme
│   └── stereo
├── freedesktop
│   ├── index.theme
│   └── stereo
├── Fresh_and_CLean

Sound Theme Downloads

I went to a site (see link below) and downloaded two sound themes, fLight 2.0 and Fresh and Clean (the third one on the site was a dead link), and tried out the sounds. I found that the Message sound in the fLight 2.0 theme was pleasant but catchy enough to be heard at a distance and over any music I might be playing at the time the CRON job runs.

http://www.ubuntuvibes.com/2010/08/3-awesome-sound-themes-for-ubuntu.html

The CRON job that runs the hourly sound is…

00 09-23 * * * aplay /usr/share/sounds/fLight__2.0/stereo/Message.wav

It will produce a sound from 9 AM to 11 PM and uses Message.wav, which I converted from an OGG to a WAV file.

I have attached the Message.wav below for your listening pleasure!


So far I have not switched my overall sound theme from the Ubuntu default, but I might try out the two themes that I have downloaded for variety.

Resources

Create Cron Jobs on a Mac

Command Line Audio Player on a Mac