Startup 8: Stardate.Today

For years, I have calendared prolifically. (Is that a word?) I track all the granular details: my to-do list; my classes and homework; my social life; gym, yoga, and exercise; and previously my jobs at Tech 2u and Starbucks. Each of these topics is on its own color-coded calendar within Google Calendar. You can take a look at what my weeks look like at cjtrowbridge.com/calendar.

There is a problem though. Sometimes I wonder what I accomplished on a particular day off or with my free time after work. I go back in my calendar to check, and there is nothing there; just a blank spot where I neglected to note the day’s events. It’s hard to quantify how well I use free time or time at work when I don’t have any record of what I accomplished. What if I could make like Janeway and just shout at the wall like it’s my journal?

Journaling always sounded interesting, but there’s no way I am going to lug around a paper journal and a pen, and even if I did, it wouldn’t be searchable.

Enter Stardate.Today. This simple tool lets me type out my stream of consciousness into private posts, and then adds them to a timeline for me to search back through. They are also added to my calendar so I can see when each thing happened.

It takes just seconds to get started. Simply log in with Google and you will be given a link which you can add to your phone or any calendar tool. On the homepage, you are presented with a simple box to type in and a list of your past posts. When you enter a post, it is automatically added to your calendar.
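
Under the hood, that link is presumably just an iCal feed. Here is a minimal sketch of how such a feed could be generated in PHP; the database schema, credentials, and UID scheme are hypothetical stand-ins, not the actual Stardate.Today code.

<?php
//feed.php: emit private journal posts as an iCal feed (hypothetical schema)
header('Content-Type: text/calendar; charset=utf-8');
$pdo = new PDO('mysql:host=localhost;dbname=stardate', 'user', 'pass');
$posts = $pdo->query('SELECT created_at, body FROM posts')->fetchAll();
echo "BEGIN:VCALENDAR\r\nVERSION:2.0\r\nPRODID:-//stardate.today//EN\r\n";
foreach ($posts as $post) {
  $stamp = gmdate('Ymd\THis\Z', strtotime($post['created_at']));
  //flatten the post body into a one-line summary
  $summary = str_replace(["\r", "\n"], ' ', substr($post['body'], 0, 75));
  echo "BEGIN:VEVENT\r\n";
  echo 'UID:' . md5($post['created_at'] . $post['body']) . "@stardate.today\r\n";
  echo "DTSTAMP:$stamp\r\nDTSTART:$stamp\r\n";
  echo "SUMMARY:$summary\r\n";
  echo "END:VEVENT\r\n";
}
echo "END:VCALENDAR\r\n";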

Simple as that.

[Working Draft] Smarter Sockets

This is something I looked long and hard for before deciding to build it myself. A school project also pulled this off the shelf and added some urgency. Last year I completed a similar but very early proof-of-concept prototype, just a web-controlled relay. It didn’t have the parallel shift registers, the relay board, or the actual sockets. I never finished it or came close to building an actual final product.

[Take a better picture]

This prototype is built on an Arduino-controlled relay board which in turn drives a series of electrical sockets, all controlled by a web page over an Ethernet connection. [Feature link to commits where I shared this with the Arduino community.] But it also lets you monitor consumption and prevents outages with a battery backup.

This project combines all of those pieces: web-controlled smart sockets, per-socket usage metering, and a battery backup.

There is not currently any product that does this, as far as I have been able to find. And as a minimalist who is committed to conserving energy, being independent, and being aware of my footprint, I was really hoping to find something like this out there.

Improving on Kill-A-Watt

Kill-A-Watt is a very interesting product which nevertheless has some huge development opportunities.

It lets you see power usage for a single electric socket. There are several problems with this.

For one, it only works for one socket.

Two, it doesn’t let you see the data except through the limited interface. There are no graphs of usage over time and there is no web accessibility for the data.

Expanding on this idea, each socket on my new device will allow you to see power consumption over time. There will be a clean web-based interface which lets you see any unusual spikes and be responsible with your energy consumption. This will pair well with the smart-socket feature which will enable you to turn things off when you are not using them or when they are using too much energy.

Improving on Smart-Sockets

Smart sockets are very limited. They typically offer only one socket, and do not offer usage metrics. Also, they feature very poorly implemented security and control software.

[Explore Steve Gibson’s IoT security reviews]

Improving on these widespread industry problems will be an important and valuable step.

At the hardware level, offering multiple smart sockets is already a huge improvement, as is offering usage metrics, but there is room to improve further. Another major feature of this project is granularity: I want to make sure to give enough detail so that developers can create multiple physical formats. Maybe you only want one socket, not eight. Why not?

There is no reason this system cannot fit into a wall socket and replace the old-fashioned ones you already have. Imagine removing the mess of plugging devices into devices into the wall, and just putting the smart socket inside the wall.

[include diagram]

Including a UPS (Might take this out)

I started with this very thorough tutorial which does a great job of explaining the terms and options that differentiate existing UPS products.

I found a discarded 2kW UPS with a dead logic board at a computer repair shop, which I was able to get for free. The sealed lead-acid batteries all worked fine; it was just a bad control board. :] This was exceptionally lucky, but you may be able to find something similar if you look.

I had also explored scavenging 18650 battery cells, which are very popular with DIY UPS builders. This alternative would probably scale better than sealed lead-acid and charge or discharge much faster. Lots of places like battery stores will happily give you free, “dead” laptop batteries full of these cells. Typically it is just the control board and maybe one or two of the half-dozen cells which are actually bad. The rest will still work fine in most cases.

Choosing a UPS Paradigm

The linked tutorial describes three main types of UPS. I chose the Online type: “The Online UPS unit completely isolates the devices attached to it from the wall power. Instead of jumping into action at the first sign of power out or voltage regulation issues like the Standby and Line-Interactive units, the Online UPS unit continuously filters the wall power through the battery system. Because the attached electronics run completely off the battery bank (which is being perpetually topped off by the external power supply), there is never a single millisecond of power interruption when there is power loss or voltage regulation issues. The Online UPS unit, then, is effectively an electronic firewall between your devices and the outside world, scrubbing and stabilizing all the electricity your devices are ever exposed to.”

If it’s more expensive, why choose this type?

The goal of this project is radical energy independence. I want this to be expandable and compatible with eventual solar or wind power generation. This basically fills the same role as a Tesla Powerwall.

For reference, I found a great online community focused on cloning the Tesla Powerwall. There are lots of great ideas and examples in there.

Future Direction

There are tons of potential directions the project could go. For example, the smart sockets could easily include powerline wifi adapters to replace standalone wifi access points, greatly increasing wifi availability in your home while eliminating obnoxious and unnecessary hardware and wires.

Methods

This project includes a public repository of all the code and plans which anyone can contribute to, and a forum for discussing it. I will try to make it fairly modular and platform-agnostic so that people can use different hardware and still have a safe and secure system.

Building an MVP/POC


The core of the project is an Arduino/compatible processor and a network stack. This can be Wifi or Ethernet. There are some great examples which combine these parts together and even include the relay board if you would rather use that.

I will be using the brand-name Arduino Uno and the Arduino Ethernet Shield, only because I already had them. If I were buying parts for this project, I would probably use the combined board I linked to in the previous paragraph, because wifi would be a great feature for this project.

This is going to take a lot of different parts which need to connect to the Arduino. We need to use a parallel shift register (Explanation) in order to control lots of things with just a few pins.

A simple relay driver board does the heavy lifting of turning the sockets on and off.

Measuring the current through each socket will require a series of special sensors wired inline and then connected to the Arduino. Alternatively, there are several other examples I am exploring for this part.
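
To give a feel for the control path, here is a rough sketch of the web side, assuming the Arduino firmware exposes a simple HTTP interface over its network stack; the board address and endpoint paths are hypothetical.

<?php
//hypothetical Arduino HTTP API: /socket/{n}/on, /socket/{n}/off, /socket/{n}/watts
$board = 'http://192.168.1.50';

//turn socket 3 on
file_get_contents("$board/socket/3/on");

//read the current consumption of each of the eight sockets
for ($i = 0; $i < 8; $i++) {
  $watts = trim(file_get_contents("$board/socket/$i/watts"));
  echo "Socket $i: $watts W\n";
}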

The main power will come from the battery bank and go through a cigarette-lighter DC-AC inverter before hitting the GFCI socket and then the relay board. This makes the whole thing very safe because there are several circuit breakers built into each of these levels.

The battery bank will be charged by a standard ATX PC power supply which will automatically be turned on and off by the Arduino when the power level requires it. (This means we only get seven sockets, since one of the eight relays will control the charging supply.)

Future Steps

The most obvious future step would be making an actual ready-to-order product which people can buy. This would require a great deal of funding since there would be regulatory requirements and manufacturing costs, but I really think this is something people would buy.

If people are willing to pay $30 or more for only a single smart-socket which does not measure usage, it makes a lot of sense that people would be willing to pay even more for more features and expandability in a device which offers multiple sockets with valuable metrics about usage.

Startup 7: Top Story Review

This is part of a series on Building 12 Startups in 12 Months.

This is number seven: TopStoryReview.com!

Black-Box News is Bad

If you look at the news on Google or Facebook, you will see a few stories which some mysterious algorithm has selected for you. Are these an accurate reflection of current events? No. These stories are often selected to confirm your biases based on your activity and search history. You are seeing the echo chamber your digital context has created for you, because that is how these companies capture and hold your attention.

Facebook and Google use black-box algorithms to pick what you see. This means that not only is there no explanation of how or why your stories were picked, but there is no way even for the engineers to reverse-engineer the algorithm and see how or why it picked the stories it did.

It is very common to see false stories featured as trending on Facebook or shared through other services. This effect has led people to do horrible things based on false information presented as news by algorithms, like shooting up a pizza place or threatening and harassing the families of murdered children.

This is a problem for democracy and a problem for all of humanity. There has been much speculation that this has been a major contributing factor in the recent rise of populism in America and the results of the recent presidential election. Everyone should have access to concise, accurate snapshots of current events. My frustration with the lack of quality and lack of transparency in news aggregation services today led me to create an open-source alternative which omits biases and individual context.

But How?

This project expands on an experiment I started in high school. I was trying to aggregate various high-quality news sources, determine what major themes were trending, and present that information in a useful way.

Back then, I started by fetching the “Top Stories” or “Breaking News” RSS feeds from a few dozen newspapers around the US and combining them all into a MySQL table. Then I did word counts to determine which words stood out, classified those words to build a list of general topics, and displayed the most recent or most reputable source’s story for each topic.

There are several big innovations I have come up with over the years which I can now incorporate into this idea in order to maximize value.

The final product will be simple homepages for lots of topics with a few bullet-points, imparting a concise and accurate representation of the current state of events.

My Open Algorithm

  1. Pull in thousands of RSS feeds from an open list of high-quality news sources all over the world.
  2. Analyze the stories to find trending topics using my open Condensr algorithm. (A rough sketch of these first two steps follows below.)
  3. Get all the stories relating to each topic.
  4. Condense all of the stories on each topic down to just one sentence.
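
As a rough illustration of the first two steps, here is a minimal PHP sketch of the fetch-and-count approach; the feed list is just an example, and the real Condensr topic detection is more involved than raw word counts.

<?php
//pull a few RSS feeds and tally which headline words stand out
$feeds = [
  'http://rss.cnn.com/rss/cnn_topstories.rss',
  'http://feeds.bbci.co.uk/news/rss.xml',
];
$counts = [];
foreach ($feeds as $url) {
  $rss = simplexml_load_file($url);
  if (!$rss) continue;
  foreach ($rss->channel->item as $item) {
    //tokenize the headline and count each word
    foreach (preg_split('/\W+/', strtolower($item->title)) as $word) {
      if (strlen($word) < 4) continue; //skip short, stopword-ish tokens
      $counts[$word] = ($counts[$word] ?? 0) + 1;
    }
  }
}
arsort($counts);
print_r(array_slice($counts, 0, 10)); //the ten biggest trending words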

Step 4 expands on the work of groups like SMMRY and Reddit’s autotldr bot. My new Condensr algorithm can summarize thousands of pages of text into just a sentence or two. This is obviously not perfect, but it is surprisingly good, and there is always room to improve later.
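
As a toy illustration of the general idea behind frequency-based extractive summarization, the sketch below scores each sentence by how frequent its words are in the full text and keeps the winner. This is a simplification, not the actual Condensr code.

<?php
//toy extractive summarizer: return the sentence whose words are most frequent overall
function condense($text) {
  $sentences = preg_split('/(?<=[.!?])\s+/', $text);
  $freq = array_count_values(preg_split('/\W+/', strtolower($text)));
  $best = '';
  $bestScore = -1;
  foreach ($sentences as $sentence) {
    $score = 0;
    foreach (preg_split('/\W+/', strtolower($sentence)) as $word) {
      $score += $freq[$word] ?? 0;
    }
    if ($score > $bestScore) {
      $bestScore = $score;
      $best = $sentence;
    }
  }
  return $best;
}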

Structure

The basic structure of the site will be a home page which combines all topics, and then lots of individual pages for each category. The home page and each topic page will have a bullet-point list of a few trending stories with one-sentence summaries. These bullet-point summaries will eventually link to a story page with a longer summary, along with links to recent reporting from various sources.

The algorithm runs every hour and maintains an archive of all the pages, so you can also look back at what was happening at any given time.
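
On the server, that hourly run is just a cron job along these lines; the script path is a hypothetical placeholder.

0 * * * * root php /var/www/topstoryreview/aggregate.php > /dev/null 2>&1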

In effect, I am building a massive pipeline that takes in much of the world’s reporting and produces high quality condensed content which is much easier to absorb.

Making Money

There will be an enormous amount of content created hourly, and that means lots of organic traffic. SEO and social integration will be critical. I will eventually include ads to monetize the content.

Perfect Server, Version 18

This is the latest iteration of my perfect server. I am building this in order to consolidate and deprecate previous server inventory. Also, it includes many new best-practices which should further secure this new server.


The first step is to provision a new server. I use DigitalOcean. I will be logged in as root for all of this, since this is all stuff that needs to be done as root. If you don’t want to log in as root, you can instead add sudo at the beginning of each command.

Now, add some sources to the package manager. Get there with;

nano /etc/apt/sources.list

Add these repositories;

deb http://ftp.debian.org/debian jessie-backports main

deb http://packages.dotdeb.org jessie all

We also need to add the GPG keys so the new repositories will work. Run these commands;

gpg --keyserver keys.gnupg.net --recv-key 89DF5277
gpg -a --export 89DF5277 | sudo apt-key add -

Update the package manager and upgrade any packages that are available;

apt-get update && apt-get upgrade

Now install all the packages we will need;

apt-get -y install fail2ban apache2 php7.0 php-pear php7.0-mysql php7.0-mcrypt php7.0-mbstring libapache2-mod-php7.0 php7.0-curl screenfetch htop nload curl git unzip ntp mcrypt postfix mailutils php7.0-memcached mysql-server && apt-get install python-certbot-apache -t jessie-backports && a2enmod rewrite && service apache2 restart && mysql_secure_installation

You will be prompted to create a MySQL root password, which you will then immediately be asked for when configuring the MySQL server securely.

Name Thyself

Now navigate to the virtualhost directory;

cd /etc/apache2/sites-available

Remove the default ssl virtualhost. We will be creating a new one instead.

rm default-ssl.conf

Rename the default virtualhost to the fqdn of the server. Example: server3.website.com. Note that this is not the fqdn of the site(s) we are hosting on the server.

mv 000-default.conf [fqdn].conf

Edit the file and replace the admin email with your own. Change the DocumentRoot to /var/www instead of /var/www/html.

Now add the following block within the virtualhost tag of the file and save it.

<Directory "/var/www">
AuthType Basic
AuthName "Restricted Content"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
</Directory>

Lock it Down

Let’s create a credential set for our new virtualhost. This is sort of a catch-all for any domains we point here which are not yet set up.

htpasswd -c /etc/apache2/.htpasswd [username]

You will be prompted for a password. This login is very bruteforceable, so my best practice is to use very high-entropy strings for both the username and the password. Typically at least 64 bits of random base 64 for each.
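
For example, since PHP 7 is already installed on this server, one quick way to generate 64 bits of random base 64 is:

php -r 'echo base64_encode(random_bytes(8)), PHP_EOL;'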

Apply Changes

We need to tell apache that we have changed the name of the default virtualhost file. First we disable the one we changed, and then enable our new one.

a2dissite 000-default

a2ensite [fqdn]

Now restart apache

service apache2 restart

Test our changes by navigating to the fqdn of the server. You should be prompted for a username and password.

Administrative Tools

We will need to put some tools in here so we can administer the server.

PHPMyAdmin

This will allow us to manage the databases we will be creating on the server. Head over to their website and get the download link for the current version.

Navigate to our new secure DocumentRoot directory and download that link.

cd /var/www && wget [link]

Now unzip it and remove the zip file we downloaded.

unzip [file] && rm [file]

Now that we have a PHPMyAdmin directory in our secure virtualhost, we need to configure it. Luckily it can do that itself! Use this command and enter the mysql root password when prompted.

mysql -uroot -p < /[unzipped phpmyadmin folder]/sql/create_tables.sql

The last thing PHPMyAdmin needs is a secret string. Edit the config file config.sample.inc.php and save it as config.inc.php.

Make sure to add a random string where prompted at the top of the file.
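
The prompt in question is the blowfish_secret setting. After editing, it should look something like this, with your own long random value in place of the placeholder:

$cfg['blowfish_secret'] = 'REPLACE-WITH-A-LONG-RANDOM-STRING';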

Postfix Outbound-Mail Server

We need to edit the config file for Postfix and change the interface to loopback-only, like so. Even if you also have a firewall rule blocking connections to port 25, firewall rules can be changed by mistake, so this is a good second line of defense to prevent public access to sending mail through our server, while still allowing us to use it locally.

nano /etc/postfix/main.cf

Find this line;

inet_interfaces = all

And change to;

inet_interfaces = 127.0.0.1

Now edit the email aliases;

nano /etc/aliases

At the end of the file, make sure there is a line that starts with root and ends with your email, like so;

root: email@domain.com

Save the file and exit. Then run newaliases to let Postfix apply the changes. Restarting Postfix is not enough because we changed the interfaces line in the config file. We need to stop and start it like so;

newaliases && postfix stop && postfix start

Now our sites will be able to send emails!

VPS Home

This is something simple I built which serves as a better index page for the secure virtual host and includes several helpful tools for diagnostic purposes. To try it out, run this command from the DocumentRoot directory.

wget https://raw.githubusercontent.com/cjtrowbridge/vps-home/master/index.php

PHPInfo

It’s helpful to be able to access details of the server’s php installation from this directory. I like to create a file called phpinfo.php which contains simply

<?php phpinfo();

Automatic Backups

Create a new file called /root/backup.sh and add the following to it. Make sure to replace the mysql password with yours.

#!/bin/bash

#makes sure the backup directory exists
mkdir -p /var/www/backups

#deletes old backups
find /var/www/backups/ -mindepth 1 -mmin +$((60*24)) -delete

#creates new backups
tar -czf "/var/www/backups/webs-$( date +'%Y-%m-%d_%H-%M-%S' ).tar.gz" /var/www/webs
/usr/bin/mysqldump -uroot -p[mysql root password] --all-databases | gzip > "/var/www/backups/mysql-$( date +'%Y-%m-%d_%H-%M-%S' ).sql.gz"

Now edit the crontab with nano /etc/crontab and add this line. This will automatically run that script every day at 8pm.

0 20 * * * root /root/backup.sh > /dev/null 2>&1

Make sure to give the script permission to execute.

chmod 775 /root/backup.sh

Offsite Backups

Now that we have regular backups going, we need to regularly get them off the server. I like BitTorrent Sync for this. It is very fast and seamless.

Run these commands to install Bittorrent Sync;

sh -c "$(curl -fsSL http://debian.yeasoft.net/add-btsync14-repository.sh)"

apt-get update && apt-get install btsync

I recommend selecting the user and group www-data for btsync as this will greatly simplify administration and file permissions.

Then navigate to the port you set up during the installation process and create credentials. As always, be sure to use very high-entropy credentials. Then create a shared folder for the backups, and copy the read-write key to your NAS to securely copy the backups every day.

The really great thing about this approach is that when your cron job deletes the backups on the server every day, BitTorrent Sync will archive them on your NAS so that they are always available. This means that at any point, you can simply take the automatically generated archives of your webs and mysql, and recreate your entire server in minutes.

Migrating Sites In

Move over the files for all the sites you want to host into individual directories in the /var/www/webs directory.

Now navigate to your virtualhosts directory.

cd /etc/apache2/sites-available

We created a default virtualhost file for the server and named it [fqdn].conf. This was the fqdn of the server, but not the sites it will host. Now we want to create our first hosted site. Copy the default file we made to create a new virtualhost like so…

cp [server fqdn].conf [site fqdn].conf

You can use any naming convention you like, but managing dozens or hundreds of these will become impossible if you are not naming them clearly.

Next, we need to add some new things to this hosted site’s virtualhost file. Add a new line inside the virtualhost tag like this;

ServerName [site fqdn]

And change the line which has DocumentRoot to point to the directory for this hosted site. For example;

DocumentRoot /var/www/webs/[site fqdn]

Lastly add these two blocks at the end of the file.

<DirectoryMatch "^/.*/\.git/">
Require all denied
</DirectoryMatch>

<Directory /var/www/webs/[site fqdn]>
Options FollowSymLinks
AllowOverride All
Require all granted
</Directory>

The first block will prevent anyone from navigating into a git repository and accessing sensitive data like credentials or from cloning the repository.

The second block will allow htaccess files or directory rewrites, and prevent directory listing. These are required changes if you want to host WordPress sites, and best practices all around.

Now we just need to enable these changes and make the site live with;

a2ensite [site fqdn] && service apache2 restart

From this point on, this new virtualhost can be copied to create new sites, rather than recreating each one from the original virtualhost file.

Free SSL

We already set up Let’s Encrypt, so now we just need to run it. Once the domains are set up and pointed to the server’s IP, and a virtualhost has been configured, all it takes is running certbot, which takes care of everything.

certbot

Startup 6: Exotic Weapons

Today, there are many fascinating examples of people using successful techniques to create passive income online.

Those of us who grew up coding and then took that perspective to business have a unique advantage. You might call it a super power.

We, the digital priesthood, conjure passive income from the ether with the aid of new information technologies.

In this new blog, we will explore the ways in which engineers are the new pioneers and magicians, drawing the first maps of the digital frontiers we create and using esoteric knowledge to pluck money out of thin air.

All of the examples we discuss on Exotic Weapons will come with enough information to do it yourself. (And without the sales pitch.)

Startup 5: Condensr

This is a free tool based on several other tools I have seen online. It accepts long-form text and condenses it to the length you specify.

The code is very simple, and very powerful. I was shocked at how easy this was to build. Check it out on Github, or head over to the site and start condensing!

Next Steps

I have started initial development of an API, and I want to add a feed of things people have Condensed, as well as a bookmarklet tool for condensing news articles.

How to Build a Free Linux Microsoft SQL Server

This covers how to create a virtual linux server running Microsoft SQL Server.


First, create a virtual server with the following requirements in mind.

  • Ubuntu 16.04 LTS (Server or Desktop)
  • At least two CPUs
  • At least 4gb RAM
  • At least 10GB HDD for the operating system
  • PLUS at least double the amount of space your databases will use

When you install Ubuntu, make sure to enable updates and third-party software, unless you’re the real DIY-type.

First, update the default installation packages;

sudo apt-get update && sudo apt-get upgrade

Now we need to install a few tools before we can get started;

sudo apt-get install cifs-utils curl

Install Microsoft SQL Server

Run these commands to install Microsoft SQL Server and its tools;

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list | sudo tee /etc/apt/sources.list.d/mssql-server.list

sudo apt-get update && sudo apt-get install -y mssql-server

sudo /opt/mssql/bin/mssql-conf setup

You will be prompted to create an SA (system administrator) password. Use something with high entropy!

Now install tools;

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list

sudo apt-get update && sudo apt-get install mssql-tools unixodbc-dev

echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bash_profile

echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc

source ~/.bashrc

Now, copy over your backup file and put it in /var/opt/mssql/data/

I have not gotten the CLI tools to work for importing the backups; they seem to look for Windows paths. You will need to use SQL Server Management Studio to import the backup.

How To Create a Local Storage Repository on XenServer

I was working with a XenServer in a complex corporate network environment, and it was not possible for this server to access any samba shares, such as my laptop. I needed to put some ISOs on it, so I decided to create a local storage repository. This way, I would be able to simply wget an ISO from the web, and then use it locally.


First, SSH into the XenServer and create a directory for the repository;

mkdir -p /var/opt/xen/LocalRepo

Then, tell Xen to create a XenServer Storage Repository at that directory;

xe sr-create name-label=LocalRepo type=iso device-config:location=/var/opt/xen/LocalRepo device-config:legacy_mode=true content-type=iso

Now move to the directory and then wget whatever ISOs you need…

cd /var/opt/xen/LocalRepo


wget http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso

Now you’re cooking with gas!


P.S. Make sure to check your free space and make sure your ISOs will fit. This partition is not very big by default.

df -H

How to Start a Business For Free (With Examples)

I gave this speech for a public speaking class. It included a self-evaluation assignment which I share here;


CJ Trowbridge

2017-07-05

Sierra College Comms 1

Scott Kirchner

Demonstration Speech Self-Evaluation Assignment

In my speech, I demonstrated how to bootstrap a business. I gave examples from my experience bootstrapping a pizza business in Chico several years ago. I felt like the speech went very well. I shared the video online and received good feedback, and my peers seemed to feel that it went well based on their reactions during the speech and afterwards. (I used a special camera to record the speech which captures the audience as well as the speaker, so I was able to review their reactions.)

When I was composing the outline for the speech and rehearsing it, I tried to make it as relatable as possible. I made sure to include at least a few concrete examples whenever I discussed abstract ideas. I find this generally lacking in most entrepreneurial literature, so I think and hope that I improved on this frustrating trend. I feel like most people can relate to this topic if it is presented properly. For these reasons, I think the content was good.

My last-minute addition of a visual aid was also a really great touch. It was more than just the visual effect, or even the smell; it was visceral. I think it really grabbed attention, and it made the value-proposition of the content become a visceral feeling for the audience. Hunger is a limbic response, a deep emotional thing. It supersedes the prefrontal cortex and the trained analytic mind. This was a major underlying theme in my speech; take the product to the people who don’t know they want it, and make them want it. I demonstrated that without even talking about it. My clincher about how the audience could take the pizza into the quad right now and quadruple the money seemed to leave them with ideas about how they could implement the ideas I had discussed. Several audience members approached me about business ideas they had and how they might bootstrap them like I did. I think this part of the speech was very effective.

In general, I would say I was not very anxious about this speech. I have had a great deal of public speaking experience from a young age, but a big part of what little anxiety I did have was timing. I am not used to timed speeches. To alleviate this anxiety, I decided to include several quick stories in my concrete examples for each abstraction. Then, I could expand on the stories as required to get to the correct time. I think concrete examples were a good idea, but I think the stories went too long, and this was the one development opportunity identified by the professor, who said in his remarks at the end of the speech that I “squirrelled” (went on tangents or rabbit trails). This had been a deliberate and strategic effort to fill time, but obviously it distracted from the content. I will try to expand on concrete details next time, or perhaps use a story as one of the major points, rather than trying to incorporate several into sub-points. Also, I should have defined the “unfamiliar” word bootstrap as soon as I first used it.

This implies a different structure would be better. Rather than enumerating abstractions and then providing concrete examples and stories, a better strategy might be to enumerate several abstractions and provide concrete examples only, then finish up with a brief story to tie everything together. This also means timing would be harder, and I will need a better strategy for making sure the time is correct. I think doing some sort of outline for the ending-story and then selectively condensing it would be a better strategy for getting the time correct.

Startup 4: CronPUT

This is part of a series on Building 12 Startups in 12 Months.

This is product number four: CronPUT!

CronPUT is a fairly self-explanatory name. Users log in with Google to sign up for an account and enter a list of webhook URLs, specifying an interval for each. At the specified time, a PUT request is sent to each URL. You can even see the resulting response status code.
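
In PHP terms, each job boils down to something like this sketch; the webhook URL is a placeholder.

<?php
//send a PUT request to a webhook and report the response status code
$ch = curl_init('https://example.com/webhooks/abc123');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
echo curl_getinfo($ch, CURLINFO_HTTP_CODE); //e.g. 200
curl_close($ch);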

I maintain a dozen or so Linux servers. Keeping track of cron jobs for that many high-entropy webhooks is insane when they are all in different places and may or may not be working.

This product solves this problem by putting all your cron webhooks in one place.