School Workflow: Triple Full-Time

For the last year, I have consistently taken a double to triple full-time course load. I am not Superman, but I do have a 4.0 GPA. I also spend a great deal of my time traveling, taking vacations, and having adventures.

All of this has been an intensely complex effort, but several simple tools have made all the difference.

Three tools in particular have dramatically improved my ability to accomplish this enormous undertaking.

Rate My Professor

I hear a lot of criticism of this platform, and a lot of it is probably true, but it has never steered me wrong. I have a simple rule: I will not take any class where the professor has less than a 4/5 rating.

As we will see, this makes all the difference.


These days, every college requires its teachers to use a learning management system (LMS), and any decent professor will understand and properly implement it. Canvas has been great, though it’s not perfect. Most any modern LMS will probably work for my system.

An LMS is very helpful but not necessarily critical to managing a large workload.

Organizing The Data

I start by making a bookmarks folder for the semester, then add a link for each class pointing to its grades section. That way, whenever I check on a class, I skip the pointless details and go straight to a list of incomplete assignments.

Time Management

Now that it is easy to see all the tasks I will need to complete, it’s time to put them all in one place. I start by creating an appointment in my calendar every Monday morning called “Check for Coursework.”

A good rule of thumb is to start with the first week and see how long it takes to go through all the classes and find assignments. I usually spend an hour or two checking every class and entering everything. Once you know how long that takes, create a recurring weekly calendar event for it.

Each class will also require some fairly consistent amount of time to complete all its coursework. I usually assume about two hours per class per week for completing assignments.

It is absolutely critical that time is set aside for each class. Problems and distractions are going to come up. Because I have created a block of time for each class, if I ever need to move my coursework time to later in the week, I know how long it will take and whether that will be possible. Here’s what an average week this summer looks like:

In grey are all my classes and the time set aside for completing assignments.

In blue, my personal plans and events I’m going to.

In yellow, tasks I need to complete which are not at a specific time.

In red, I have time set aside for sleep. This is another important thing to remember. Getting enough sleep every night is critical to performing on this level. You’re no good to yourself or anybody else if you’re half asleep.


Maybe the most important step in my process is creating a fresh to-do list every week for school. I love Trello for this purpose, a great recommendation from Pieter Levels. It’s a great, free tool for making to-do lists, and it integrates with IFTTT, which will be super handy in a few minutes.

It’s Monday morning, so this is what my Trello board looks like…


The first column is titled with the date on which the week starts. The second column contains anything I need to finish today. The third column contains any school tasks which I have not completed yet, but which need to be done this week.

In the past, I would go through all my classes and create entries in Trello for anything I needed to finish that week. Then I would use the time I had blocked off in my calendar to work on each class until everything was done.



I love IFTTT. I use it for tons of home automation and other fun projects. But it’s not all fun and games. In this case, it can put all my coursework into my to-do list for me!

The Canvas LMS is what both of my colleges are using right now, and it has a feature which most LMSes probably share: it gives you a URL which lets you add your assignment due dates to your calendar. I added this URL to my Google Calendar, then connected Trello and Google Calendar to IFTTT. From there it was as simple as creating an applet which automatically adds anything in that calendar to Trello as a new item in my “School” list. Here are the settings I used:

To summarize, any time any of my professors at either college creates an assignment, it is automatically inserted into my school to-do list on Trello.

Note that I left the “Description” field blank because it adds a lot of unnecessary extra information. I just want the name of the assignment and the due date.
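For anyone who wants the same result without IFTTT, the core of the applet can be sketched directly in bash: pull the assignment titles out of the Canvas calendar feed and create a Trello card for each one through Trello’s REST API. This is only a sketch; the feed URL, list ID, and key/token are placeholders you would fill in with your own values.

```shell
#!/bin/bash
# Hypothetical no-IFTTT version of the applet: read the Canvas .ics
# calendar feed and post each assignment to Trello as a new card.
# ICS_URL, LIST_ID, TRELLO_KEY, and TRELLO_TOKEN are placeholders.

# Pull one assignment title per line out of an ICS calendar on stdin.
# (Good enough for a sketch; real ICS files can fold long lines.)
extract_assignments() {
  grep '^SUMMARY:' | sed 's/^SUMMARY://'
}

# Intended usage (commented out because it needs real credentials):
#
#   curl -s "$ICS_URL" | extract_assignments |
#   while IFS= read -r name; do
#     curl -s -X POST 'https://api.trello.com/1/cards' \
#       --data-urlencode "idList=$LIST_ID" \
#       --data-urlencode "name=$name" \
#       --data-urlencode "key=$TRELLO_KEY" \
#       --data-urlencode "token=$TRELLO_TOKEN" > /dev/null
#   done
```

Run from the same weekly cron slot, this would do roughly what the applet does, minus IFTTT’s deduplication.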

With this incredible automation in place, I spend my coursework time on Mondays simply cleaning up complex assignment names and putting things in order of how I want to complete them. Here’s what my Trello board looks like after this process…

Final Thoughts

With these tools in place, I am able to take double and triple full-time workloads without worrying about missing things or whether I will have time to complete the work. I already know exactly how much time I will need and when that time fits into my week. I know everything that’s due and I rank it in order of when I want to get it done.

A lot of the inspiration for this process probably came from reading Tim Ferriss, Pieter Levels, and others writing about similar processes for maximizing impact while minimizing work at startups.

It’s funny to me how much of our time we spend worrying about the workload instead of actually doing the work. When you eliminate that anxiety by building a foolproof structure around your work, it’s easy to maximize what you can get done.

Startup 12: Smart Mailer

I built a tool very similar to this one as an experiment, to see how much money I could bring in to a previous workplace by sending automated emails based on various criteria. The answer turned out to be millions and millions of dollars. This exciting surprise largely set the direction of my career: it led to my choice to quit that job and get my MBA so I can get funding and start a large project around this idea.

It would be very easy to accidentally use this product in a way which violates laws all over the world. You MUST research the legal requirements in your jurisdiction before using any part of this product, and make sure you have unambiguous and clear consent as required by law before sending any emails to anyone.

I accept no responsibility for the misuse of this code, accidental or deliberate, by anyone.

That Time I Found Millions of Dollars Under The Cushion…

Lots of businesses have lots of data. They may not know how to access the data or how to analyze it, or even that it could be valuable to analyze it, but businesses keep track of things like what customers buy and when. They keep track of things like how frequently a customer uses their services. They keep track of things like which products have particular behaviors and lifecycles.

They can’t help but keep track of these things. The question is whether it is possible to extract value from this data.

Your Customers Are Getting Older.

Customers age. In business terms, this sometimes means that a customer has not been seen in a while or has not purchased any products for some time. A previous employer of mine had all the contact information for these people, but no free labor to perform outreach.

The answer was obvious: send all of those people an email every few months, automatically. And it worked spectacularly. The percentage of customers who disappeared each month dropped right away, and kept dropping as more and more customers converted on new services.
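The whole mechanism can run unattended from cron. Here is a hedged sketch of the idea; the message, coupon code, table, and column names are all invented for illustration, and (per the warning above) you must only email people who have consented.

```shell
#!/bin/bash
# Sketch of the automated win-back mailer, meant to run from cron every
# few months. Schema and message text are illustrative assumptions.

# Compose the outreach message for one recipient.
build_email() {
  local addr="$1"
  cat <<EOF
To: $addr
Subject: We're thinking of you

We haven't seen you in a while. How are things?
Here's a coupon to welcome you back: COMEBACK10
EOF
}

# Intended cron usage (commented out because it needs a live database):
#
#   mysql -N -e "SELECT email FROM customers
#                WHERE last_purchase < NOW() - INTERVAL 6 MONTH
#                  AND marketing_consent = 1" shop_db |
#   while IFS= read -r addr; do
#     build_email "$addr" | /usr/sbin/sendmail -t
#   done
```

Write the query and the email once, and the outreach runs itself from then on.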

Your Products Are Getting Older.

Lots of products have life cycles. Nothing lasts forever. If it did, there wouldn’t be opportunities for businesses. Analyzing past customer data and behavior can illuminate likely durations after the purchase of particular products when a customer may need another product.

Maybe it’s the same thing they bought before. Maybe some kind of tuneup. Maybe it’s something complementary. Find those patterns and determine when it’s the right time to remind the customer that you’re thinking about them, and don’t be afraid to send them a coupon. I literally wrote an automated email for one such pattern which simply said, “We’re thinking of you. Here’s a coupon.” It was one of the most successful campaigns we ever tried.

At my last workplace, this larger strategy of analyzing trends and incentivizing customers when those moments come up brought in tens of thousands of dollars a month in new revenue at essentially no ongoing cost. You write the queries and emails once, and then they send automatically every day.

Your Employees Are Imperfect.

Employees make mistakes. Sometimes they try to cheat and steal. It can be easy to find these kinds of things in the data and automatically alert supervisors so they can look into it. This saves an enormous amount of unnecessary oversight: the regular inspections and investigations needed to make sure procedures are followed correctly. Simply write a query which finds the problem, and have it send an email to a manager.
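The shape of that query-then-alert job is simple. A sketch, with the schema, threshold, and addresses invented for illustration:

```shell
#!/bin/bash
# Sketch of the exception report: run a query for suspicious activity
# and email a supervisor only when something actually turns up.

# Emit an alert email body only when the report is non-empty.
maybe_alert() {
  local report="$1"
  [ -n "$report" ] || return 0
  printf 'Subject: Unusual void activity detected\n\n%s\n' "$report"
}

# Intended cron usage (commented out because it needs a live database):
#
#   report=$(mysql -N -e "
#     SELECT employee, COUNT(*) FROM transactions
#     WHERE type = 'void' AND created > NOW() - INTERVAL 1 DAY
#     GROUP BY employee HAVING COUNT(*) > 5;" shop_db)
#   maybe_alert "$report" | /usr/sbin/sendmail manager@example.com
```

The manager only hears about the days when the numbers look wrong, which is the whole point.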

This simple concept eliminated multiple entire management positions at my last workplace and allowed those managers to work on more important things like growing sales and developing the team.


The Future

There are a lot of exciting directions this product can go in the future. This concept within my various projects has been under constant development for almost five years in one form or another. I am very excited to be able to share this concept as a product and to give others the opportunity to find the missing millions under the cushions like I did.

Startup 11: PHP TTS Webhooks

Smart Homes Are Pretty Great

I have very much jumped on the bandwagon with smart home products. I have smart lights, a smart fan, a virtual assistant in every room; mine is a very smart house. I have also gotten very into IFTTT and related functional services which integrate with these smart devices to make them even smarter and more interconnected.

Example: my lights turn red at sunset automatically each day. This is an exciting new possibility which can even confer health benefits. With cheap commodity products, and running no servers or custom hardware or software, I can easily create triggers in IFTTT which pull the sunset time from weather services and toggle my lights to red exactly at sunset each day. (Hopefully it goes without saying that sunset is at a different time each day.) Imagine trying to do this a decade ago; it would have been ridiculously complex. Just look at my Arduino smart plug project, for example, where I do it from relative scratch.

A Problem

There is one thing that still is not possible with any of the popular virtual assistants and smart home devices such as Google Home or Amazon Echo.

I can talk to Google Home or Amazon Echo all day, and they are happy to acknowledge me and do whatever I ask of them. But there is absolutely no way to make them reach out to me in response to some external trigger.

Example: my Ring Doorbell detects motion in the front yard, and I have my phone on silent. I will not know until I next check my phone that motion was detected. There is absolutely no built-in way to make my army of virtual assistants notify me of something like this. I could buy Ring’s official plug-in chime (which I did), but this should not be necessary when I already have little computers in every room which should be able to just talk to me. Thinking about this problem gave me lots of other interesting ideas about things I would like to be notified about: problems at work, time-sensitive emails from teachers, or any number of other potentially important issues.

A Solution

I have a whole box of Raspberry Pis from old projects. With one simple command, I can make them say anything I want. I just need to install say:

sudo apt-get install say

Then I created a simple PHP script which accepts webhooks and calls the say command. This is a little complicated to set up, but I have included full directions on the GitHub page.
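The actual script is PHP, but the shape of it fits in a few lines. Here is a hedged bash sketch of the receiving end; the `message` field name and the endpoint in the comment are assumptions, and the decoder is deliberately simplified.

```shell
#!/bin/bash
# Bash sketch of the webhook receiver. The real script is PHP; this
# just shows the shape: decode a form-encoded "message" and speak it.

# Minimal urldecode: '+' becomes a space, %XX escapes become bytes.
urldecode() {
  printf '%b' "$(printf '%s' "$1" | sed -e 's/+/ /g' \
    -e 's/%\([0-9A-Fa-f][0-9A-Fa-f]\)/\\x\1/g')"
}

# Handle one webhook body such as 'message=Motion%20detected'.
handle() {
  local msg
  msg=$(urldecode "${1#message=}")
  printf '%s\n' "$msg"      # the real script pipes this to: say
}

# A caller (an IFTTT Webhooks action, for instance) might do:
#   curl -d 'message=Motion detected out front' http://raspberrypi.local/say.php
```

Anything that can make an HTTP request can now make the house talk.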

There is also a beta feature which saves the speech as an mp3 file and then casts it to the virtual assistants around the house. This means I can use IFTTT or other functional-paradigm cloud services to trigger my virtual assistants to speak to me based on external events. It does mean I have to run a single piece of hardware: the Raspberry Pi. But I had to struggle to remember where I even plugged it in. It’s that easy to forget.

There are all kinds of exciting directions this project could go in the future. There is no reason it shouldn’t be able to cast TV shows or movies from my NAS, for example. The potential of this product is very exciting. I look forward to future versions, and I think this is one of the few cases where the product will outlive my Levels Challenge.

Startup 10: Astria

Astria is an exciting from-scratch rewrite of a web application framework I first started over a decade ago. It manages database connections and allows rapid development of data-driven applications. It is very easy to create large and complex databases which Astria can serve as a simple JSON API, or as a complete managed web-based GUI featuring OAuth or email authentication of users.

Astria also manages large sets of complex teams with equally complex permission structures and allows complex tracking and analysis of user behavior.

I have built simple websites in Astria, and even a complex front-end to a legacy CRM product for a multi-million-dollar tech company; it serves dozens of users and manages all daily operations for that company.

Perhaps the most exciting part about Astria is that it is exceptionally efficient and designed to scale indefinitely. It requires very few resources even under enormous workloads. It is also designed from the ground up for sharding and distributed workloads. One server or one hundred servers running Astria can balance any number of users and requests seamlessly.

Astria also makes it very easy to develop and deploy quality assurance tools such as outbound survey calls and emails. This feature is in operation at the tech company I mentioned.

A last point is the simplicity of marketing automation. I am spinning this part of the product off into another product which an upcoming post will detail. With Astria, it is trivially easy to run a query on a cron job to qualify a list of recipients for a dynamically generated email. This feature has created millions of dollars in revenue for businesses using the framework. Imagine all the types of customers you could send an email to automatically without needing to think about it. For example, “We haven’t seen you in six months. How are things?” The same can be done for the one-year mark, and so on. This feature can also be used for internal development. For example, the system can automatically email users as well as their managers when particular types of behavior are detected. This can save time and money, and keep employees honest.

Astria is an incredibly powerful and versatile tool which can empower and protect companies while passively generating millions of dollars in revenue. Check it out and let me know if you are interested in contributing to this ongoing project!

The Levels Finish Line In Sight

As the finish line approaches, I have noticed issues with several of the older projects in the list. Maintaining them seems pointless unless I plan on actually using them, so I am removing some of the projects from production while I focus on completing the last items on the list. These projects had also met with limited interest in test marketing.

Startup 9: What are you wearing today?
Startup 8: Stardate.Today
Startup 7: Top Story Review
Startup 6: Exotic Weapons
Startup 5: Condensr
Startup 4: CronPUT
Startup 3: Draupnr
Startup 2: RSI Alert
Startup 1: Securities Science

All of these are still available on Github if anyone wants to check out the code or clone and reactivate them. The RSI Alert project in particular has found a following and I’m sorry to disappoint the few dedicated users who are interested in this project. Feel free to fork the repo and take it over if you’d like! :]

Three projects left!

Quit Your Mediocre Job And Get An MBA

Going to college is an investment, but many people assume that just any college is a good investment. I know a lot of people who have gotten liberal arts bachelor’s degrees and ended up stuck working in restaurants for years, unable to find a real job. That is a waste of an education. All that money and time spent, and they are no better off than someone who didn’t go to college at all. And they probably have debt to pay back.

I decided to wait over a decade to go to school until I decided what I wanted to do and came up with a coherent plan to actually get a return on this huge investment.

If you find yourself working a mediocre job which you hate, then maybe it’s time to do more. So what would happen if you just quit your job today, took out student loans, and went to school for an MBA?

According to research done by US News in 2016, 88% of students who get an MBA find a job within three months, making an average of $126,919. That’s the average. Consider the average person for a moment and ask yourself whether you’ll be ahead of that curve. According to Bloomberg, the average person roughly triples their previous salary by getting an MBA, from ~$50k to ~$145k.

Cold Turkey

Imagine quitting your job today and starting the path to your MBA tomorrow.

To save money and improve your chances of getting into a good school, you decide to start by finishing the IGETC at a community college. If you’re not working at all, then tuition is free, and you will get about $3k/semester in financial aid. Let’s assume you take out about $20k in student loans along with that financial aid to cover living expenses while getting through the IGETC. This number is deliberately high; my own amount was much lower. And I went full-time at two different community colleges to speed up the process.

Now it’s time for a four-year school, BUT since we did the IGETC and Assist, we’re already halfway done. Let’s assume we decide on a mid-range state school like San Diego State University and a bachelor’s that actually has job potential, like engineering, marketing, or computer science, rather than something pointless like psychology or art. There are cheaper options and more expensive options out there, but the important thing is to get a degree that is actually going to mean something to an employer; otherwise, what’s the point?

The average cost (including tuition, room and board, supplies, and other expenses) for in-state students at San Diego State University is $28,224, minus an average financial aid award of $11,400. That’s $16,824 per year. Since we did the IGETC and Assist at community college, we’re only spending two years here, which comes out to a total of $33,648 on a student loan.

Alright, so now our total loan principal is $53,648 and we have a valuable bachelor’s degree. Time for that MBA.

Bloomberg has really comprehensive research on this, and they put the average cost of an MBA in the US at just $53k plus living expenses. There are no grants for master’s students, so let’s add another $20k in debt to cover living expenses while we finish the program.

Now we have our MBA and debt of around $126,648.
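To keep the running total honest, the loan components add up like this (a quick bash sanity check; the figures are the ones quoted above):

```shell
# Loan components: $20k community-college living expenses, $33,648 for
# the state school, $53k MBA tuition, $20k MBA living expenses.
echo $(( 20000 + 33648 ))                   # total after the bachelor's
echo $(( 20000 + 33648 + 53000 + 20000 ))   # total after the MBA
```

That’s $53,648 after the bachelor’s and $126,648 after the MBA.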

Less Debt Than Income

Remember from above that, within three months on average, MBA grads will be making an average of $126,919/year. We could pay all of this debt off in the first year if we stay as frugal as we were in college, or more likely we will spread it out over the next few years and enjoy some of the fruits of our labor. The point is that this amount of debt is trivial for an MBA grad. On average, the total amount of debt is LESS than the starting annual salary.

Having student loans and paying them off is a great way to demonstrate you are creditworthy. Once you get past the educational hurdle and triple your income, you will be able to do things you never could have before.

Just Do It

It’s scary to leave the comfortable routine and reinvest in one’s future, but it makes sense for anyone smart and capable to make a choice like this, especially if they are tired of wasting time doing mediocre work for mediocre rewards. Life is too short!

Years of Problems Solved


#deletes old backups
find /var/www/backups/ -mindepth 1 -mmin +$((60*24)) -delete

#creates new backups
tar -czf "/var/www/backups/webs-$( date +'%Y-%m-%d_%H-%M-%S' ).gz" /var/www/webs
/usr/bin/mysqldump -uroot -p[mysql root password] --all-databases | gzip > "/var/www/backups/mysql-$( date +'%Y-%m-%d_%H-%M-%S' ).sql.gz"

This is a script I wrote years ago that shall live in infamy. It creates an automatic backup of databases and virtualhost directories. It is called by a cron job each day, and builds the new archive files, depositing them into the backups folder.

The backups folder is a Bittorrent Sync repository which automatically copies the backups to other NAS servers. This script also deletes the old backups each day as you can see at the top. Because the files are deleted on this server, the remote repositories they are syncing with retain old versions. This means that all prior backups are saved on the remote NAS server, but only the most recent backups are ever stored locally.

Because the files are transferred with the BitTorrent protocol, they are end-to-end encrypted and highly available across an unlimited number of nodes, so the remote NAS servers will share the backups with each other if necessary.

This is a super good system which provides highly available, secure, and completely free offsite backups. The remote server has a locked-down virtualhost which shares the backups, so they are available to other command-line scripts which can fetch them and deploy new versions of these servers in seconds.

Also keep in mind this is a simplified version of the script. The actual script I use creates a separate backup file for each database and for each virtualhost, while this one creates a single archive of all databases and a single archive of all virtualhosts. That is still a good system, but it makes it harder to deploy one virtualhost or database on its own if a server hosts more than one.
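The per-item variant is a small extension: loop over databases and virtualhost directories instead of archiving everything at once. A sketch, assuming the same paths as the script above and a passwordless root login via ~/.my.cnf:

```shell
#!/bin/bash
# Sketch of the per-item variant: one archive per database and one per
# virtualhost, so a single site or database can be restored alone.

# Filter MySQL's internal schemas out of a list of database names.
user_databases() {
  grep -Ev '^(information_schema|performance_schema|mysql|sys)$'
}

backup_all() {
  local stamp
  stamp=$(date +'%Y-%m-%d_%H-%M-%S')

  # One gzip'd dump per user database.
  mysql -uroot -N -e 'SHOW DATABASES' | user_databases |
  while IFS= read -r db; do
    mysqldump -uroot "$db" | gzip > "/var/www/backups/mysql/$db-$stamp.sql.gz"
  done

  # One tarball per virtualhost directory.
  for dir in /var/www/webs/*/; do
    tar -czf "/var/www/backups/www/$(basename "$dir")-$stamp.tar.gz" "$dir"
  done
}
```

With the archives split this way, restoring one site no longer means unpacking everything.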

The Symptom

Every once in a while, after some unknown period of time, Bittorrent Sync stops working. It reports an unknown error and has to be reconfigured. Then it works fine for some unknown period of time, but this ALWAYS happens again.

It only happened on some servers, despite the same script running on all of them. (I now realize the reason.)

I tried for a long time to figure it out, but I chalked it up to a bug in BTSync, because it is a mildly janky, gratis, closed-source, and long-discontinued product. I just kept periodically reconfiguring BTSync, and everything kept working despite this little annoyance.

The Cause

Bittorrent Sync stores the configuration files for each repository in a hidden directory within that repository called “.sync”, much like git keeps its data in “.git”.

When my script deletes old files in that directory, it also deletes the Bittorrent Sync configuration files, then BTSync crashes until it is reconfigured.

The Fix

This is the new script which solves the problem:


#deletes old backups
find /var/www/backups/www/ -mindepth 1 -mmin +$((60*24)) -delete
find /var/www/backups/mysql/ -mindepth 1 -mmin +$((60*24)) -delete

#creates new backups
tar -czf "/var/www/backups/www/webs-$( date +'%Y-%m-%d_%H-%M-%S' ).gz" /var/www/webs
/usr/bin/mysqldump -uroot -p[mysql root password] --all-databases | gzip > "/var/www/backups/mysql/mysql-$( date +'%Y-%m-%d_%H-%M-%S' ).sql.gz"

As you can see, each type of backup now goes into its own subdirectory, and backups are only deleted from those subdirectories. The config files are no longer affected when backups are deleted.



Startup 9: What are you wearing today?

Like many people, I am a student. I attend over a dozen classes at two different colleges, plus regular social events and networking meetups. I often wonder as I am dressing: is this the same thing I wore the last time I went where I’m heading now?

It sounds silly, but it’s something I always ask myself because I want to make the right impression. This is further complicated by the fact that I am something of a minimalist, so I keep fewer than ten shirts at any given time and only a few pairs of pants and shorts.

I was talking one day to my mom about this anxiety I feel, and she emphatically agreed.

I decided to make a simple app to keep track of what I wear each day so that I can look back and see what I wore last time I was at a particular class or event and know not to wear the same thing again. It’s schmuck insurance.

Wearing.Today is the result!


The minimum viable product version of this app is not social; each user’s profile is private. Users can post pictures from a simple single-page app which also lets them edit each post’s blurb, or delete their posts.

Paradigm Shift

This app is written completely in the functional paradigm. It is a single-page HTML/JS application built to be hosted on S3, with Lambda for the functions.

I did not actually deploy it there, because I don’t already have accounts with those services and they don’t support my primary language, PHP. The point was not to actually deploy, but to rapidly write a simple, complete app in that paradigm and get a sense of the workflow and of how to construct the API.

Startup 8: Stardate.Today

For years, I have calendared prolifically. (Is that a word?) I track all the granular details: my to-do list is on one calendar; all my classes and homework; my social life; gym, yoga and exercise; and previously my jobs at Tech 2u and Starbucks. Each of these topics is on its own color-coded calendar within Google Calendar. You can take a look at what my weeks look like at

There is a problem though. Sometimes I wonder what I accomplished on a particular day off or with my free time after work. I go back in my calendar to check, and there is nothing there; just a blank spot where I neglected to note the day’s events. It’s hard to quantify how well I use free time or time at work when I don’t have any record of what I accomplished. What if I could make like Janeway and just shout at the wall like it’s my journal?

Journaling always sounded interesting, but there’s no way I am going to lug around paper and a pen to write in it, and even if I did, it wouldn’t be searchable.

Enter Stardate.Today. This simple tool lets me type out my stream of consciousness into private posts, and then adds them to a timeline for me to search back through. They are also added to my calendar so I can see when each thing happened.

It takes just seconds to get started. Simply log in with Google and you will be given a link which you can add to your phone or any calendar tool. On the homepage, you are presented with a simple box to type in and a list of your past posts. When you enter a post, it is automatically added to your calendar.

Simple as that.

[Working Draft] Smarter Sockets

This is something I looked long and hard for before deciding to build it myself. A school project also pulled this off the shelf and added some urgency. Last year I completed a similar but very early proof-of-concept prototype of the web-controlled relay board alone. It didn’t have the parallel shift registers, the full relay driver setup, or the actual sockets. I never finished it or came close to building an actual final product.

[Take a better picture]

This prototype built on an Arduino-controlled relay board which in turn drove a series of electrical sockets controlled by a web page over an Ethernet connection. [Feature link to commits where I shared this with the Arduino community.] But it also lets you monitor consumption and prevents outages with a battery backup.

This project combines these things;

There is not currently any product I have been able to find that does this. And as a minimalist committed to conserving energy, being independent, and being aware of my footprint, I was really hoping to find something like this out there.

Improving on Kill-A-Watt

Kill-A-Watt is a very interesting product which nevertheless has some huge development opportunities.

It lets you see power usage for a single electric socket. There are several problems with this.

For one, it only works for one socket.

Two, it doesn’t let you see the data except through its limited built-in interface. There are no graphs of usage over time, and there is no web access to the data.

Expanding on this idea, each socket on my new device will allow you to see power consumption over time. There will be a clean web-based interface which lets you see any unusual spikes and be responsible with your energy consumption. This will pair well with the smart-socket feature which will enable you to turn things off when you are not using them or when they are using too much energy.

Improving on Smart-Sockets

Smart sockets are very limited. They typically offer only one socket, and do not offer usage metrics. Also, they feature very poorly implemented security and control software.

[Explore Steve Gibson’s IoT security reviews]

Improving on these widespread industry problems will be an important and valuable step.

At the hardware level, offering multiple smart sockets is already a huge improvement, as is offering usage metrics, but there is room to improve further. Another major feature of this project is granularity. I want to make sure to give enough detail so that developers can create multiple physical formats. Maybe you only want one socket, not eight. Why not?

There is no reason this system cannot fit into a wall socket and replace the old-fashioned ones you already have. Imagine removing the mess of devices plugged into devices plugged into the wall, and just putting the smart socket inside the wall.

[include diagram]

Including a UPS (Might take this out)

I started with this very thorough tutorial which does a great job of explaining the terms and options that differentiate existing UPS products.

I found a discarded 2kW UPS with a dead logic board at a computer repair shop, which I was able to get for free. All 2kW worth of sealed lead-acid batteries worked fine; it was just a bad control board. :] This was exceptionally lucky, but you may be able to find something similar if you look.

I had also explored scavenging 18650 battery cells, which are very popular with DIY UPS builders. This great alternative would probably also scale better than sealed lead-acid and charge or discharge much faster. Lots of places, like battery stores, will happily give you free “dead” laptop batteries full of these cells. Typically it is just the control board, and maybe one or two of the half-dozen cells, that is actually bad. The rest will usually still work fine.

Choosing a UPS Paradigm

The linked tutorial describes three main types of UPS. I chose the Online type, “The Online UPS unit completely isolates the devices attached to it from the wall power. Instead of jumping into action at the first sign of power out or voltage regulation issues like the Standby and Line-Interactive units, the Online UPS unit continuously filters the wall power through the battery system. Because the attached electronics run completely off the battery bank (which is being perpetually topped off by the external power supply), there is never a single millisecond of power interruption when there is power loss or voltage regulation issues. The Online UPS unit, then, is effectively an electronic firewall between your devices and the outside world, scrubbing and stabilizing all the electricity your devices are ever exposed to.”

If it’s more expensive, why choose this type?

The goal of this project is radical energy independence. I want this to be expandable and compatible with eventual solar or wind power generation. This basically fills the same role as a Tesla Powerwall.

For reference, I found a great online community focused on cloning the Tesla Powerwall. There are lots of great ideas and examples in there.

Future Direction

There are tons of potential directions the project could go. For example, the smart sockets could easily include powerline Wi-Fi adapters to replace standalone access points and greatly increase Wi-Fi availability in your home, while eliminating obnoxious and unnecessary hardware and wires.


This project includes a public repository of all the code and plans which anyone can contribute to, and a forum for discussing it. I will try to make it fairly modular and platform-agnostic so that people can use different hardware and still have a safe and secure system.

Building an MVP/POC


The core of the project is an Arduino (or compatible) processor and a network stack, which can be Wifi or Ethernet. There are some great examples which combine these parts, and some even include the relay board if you would rather use that.

I will be using the brand-name Arduino Uno and the Arduino Ethernet Shield, only because I already had them. If I were buying parts for this project, I would probably use the combined board I linked to in the previous paragraph, because Wifi would be a great feature here.

This is going to take a lot of different parts which need to connect to the Arduino. We need to use a parallel shift register (Explanation) in order to control lots of things with just a few pins.

A simple relay driver board does the heavy lifting of turning the sockets on and off.

Measuring the current through each socket will require a series of special sensors wired inline and then connected to the Arduino. Alternatively, there are several other examples I am exploring for this part.

The main power will come from the battery bank and go through a cigarette-lighter DC-AC converter before hitting the GFCI socket and then the relay board. This makes the whole thing very safe, because there are circuit breakers built into each of these stages.

The battery bank will be charged by a standard ATX PC power supply, which the Arduino will automatically turn on and off as the power level requires. (This means we only get seven sockets, since one of the eight relays will control the charging supply.)

Future Steps

The most obvious future step would be making an actual ready-to-order product which people can buy. This would require a great deal of funding since there would be regulatory requirements and manufacturing costs, but I really think this is something people would buy.

If people are willing to pay $30 or more for a single smart socket which does not measure usage, it makes a lot of sense that they would pay even more for a device with multiple sockets, valuable usage metrics, and expandability.