This past Sunday marked my first AutoX event with a couple of buddies at Atlanta Motor Speedway. While the day was hot, sweaty, and hungover (thanks to Jay's He-Man Juice the night before), it was an absolute blast.
The experience was very different from HPDE events I've done in the past; arguably, AutoX is a different animal altogether. Honestly, the most challenging aspect is actually finding the course on the first couple of runs. Unlike HPDE events, which are on road courses (with little to no variability in the route), Auto Cross (or AutoX for the initiated) uses large parking lots and "gates" defined by marker cones, where the driver has to learn the course while trying to set their best time on it. Each vehicle is "spaced" safely from the vehicles running the course ahead of and behind them, so there's no pressure from other drivers or risk of collision. Additionally, unlike HPDE, drivers are also required to "work the course" when their run group is not running, by noting cone hits or DNFs and ensuring the cones are in their correct positions. BMW CCA's AutoX allowed 8 runs, with a time penalty of 1 second for any cone hit and a DNF for any "missed" gates. Comparing HPDE to AutoX, I'd highlight the pros and cons of AutoX as follows (pros first, then cons):
- Less risk of colliding or damaging your or others' vehicles
- More ability to "push the limits" of traction and control
- No pressure from other vehicles on the course, no having to provide "point-bys" or anticipate other drivers' behavior on course
- Less risk of a "money shift" since you can pretty much stay in the same one to two gears throughout the course
- Greater "appreciation" for the event since you are both a participant and a course worker
- Exercise (e.g. running to pick up cones, monitoring your assigned sector, and being "on your feet")
- Less expensive
- Challenge of finding the route/course crowds out/distracts from the ability to identify braking zones, apexes, etc
- Lower speed and less distance do not lend themselves to better understanding your vehicle's handling characteristics in a dynamic environment
- No "persistence," so once you learn a course, it's only good for that day - the next time you return, it will be a different course
- Less seat time
- Less technical in driving requirements (e.g. no shifting or heel-toe required, limited benefit or opportunity for trail braking, little to no elevation change)
Since purchasing our house (or rather 20% of it) in September, I've dived headfirst into home automation. It's turned into a hobby, as it combines a lot of my interests - computing and virtualization, lighting and electrics, and working with my hands. A lot of friends and colleagues have expressed an interest in home automation, so I decided to put together a short guide around what I've done and what's worked for me. Before I lay claim to having derived this information all on my own, I must admit it's an amalgamation of information from forums, Amazon.com, friends, and neighbors (hat tip to Brian, if you're reading).
Choosing a system
This is very much a developing industry and technology, and in many ways it is still immature. There are many offerings from many manufacturers leveraging different protocols and technologies. It definitely follows the adage, "Fast, cheap, or good. Pick two." Definitely do your research: depending on which platform you choose, you can get locked in, as not all "Smart" devices play nice with each other. The key things to consider are:
Cost
Some platforms are more expensive than others. Not only are you looking at upfront costs, you also have to consider subscription fees for things like cloud storage, applications or plugins, etc. Ikea has a new offering that is supposed to be cheap and easy, but it's not extensible. Apple has HomeKit, which is on the more expensive end but requires less "fiddling," at the expense of extensibility and customizability. Google Home and Amazon Alexa support some native "Smart" capabilities, but they're not true smart platforms (see protocols and efficiency below).
Support
Some platforms have more community support than others, which includes troubleshooting, device support, custom logic, and device handlers.
Extensibility
This goes hand-in-hand with support, but open platforms allow developers to contribute to and expand the functionality of different devices, which tends to happen faster than the vendors themselves integrate their solutions. A good example is thermostats: EcoBee (which is what I use) has Samsung SmartThings (which is what I use) support, but its functionality pales in comparison to a custom device handler that a community member wrote. SmartThings also includes the ability to install community-developed SmartApps, allowing for advanced orchestration and automation.
Protocols and efficiency
Many devices market themselves as "Smart," but it's more a marketing term than anything else. Just because something has WiFi connectivity doesn't inherently make it "Smart," and WiFi is definitely not an ideal protocol for smart device networks due to its power requirements and design. Below is a quick rundown of the different "Smart" protocols:
- WiFi - many appliances, thermostats, doorbells, and cameras leverage WiFi for connectivity into the "smart" platform. While WiFi is appropriate for some of those devices (e.g. a doorbell), it is pretty inefficient for most. You have to take into account what WiFi was designed to do - handle a large volume of network traffic, with emphasis on throughput and speed at the expense of power and simplicity. While a WiFi plug, bulb, switch, or camera may work for your needs, they also add significant overhead in terms of number of devices, chatter, and traffic on your home wireless network. WiFi cameras in particular can bog down a home wireless network due to the low-latency, high-bandwidth requirements of modern 1080p or 4K cameras. Additionally, range can be limited with WiFi, and you have to take into consideration the distance of devices from your wireless router/access point. Generally speaking, "Alexa-enabled" or "Google Home-enabled" devices are simply WiFi devices, which means no smarthub is required.
- Bluetooth - Apple HomeKit uses Bluetooth for some devices, where you can use an iPad or Apple TV as a "smarthub" which acts as a bridge between a TCP/IP network and Bluetooth devices. Many "smartlocks" use Bluetooth to interface directly with a phone or HomeKit. While requiring less power than WiFi, Bluetooth has even greater range limitations, often requiring line-of-sight distance from a phone or an Apple TV to work. Additionally, Bluetooth is a relatively insecure protocol. The newest implementation of it supports encryption, but few vendors are currently implementing it effectively. Needless to say, I wouldn't want my home's physical security subject to the vulnerabilities of Bluetooth (i.e. a door lock).
- Zigbee - a purpose-built home automation technology, it is a low-cost, low-power protocol. It is gaining popularity but is inferior to Z-Wave (see next point) due to its lack of interoperability, narrower device and manufacturer support, and lack of forward- and backward-compatibility. You'd need a smarthub (e.g. Samsung SmartThings, Wink, Wemo, HomeKit, etc.) to bridge between your wireless network and Zigbee devices.
- Z-Wave - a purpose-built home automation technology, it is low-cost, low-power, and has the widest manufacturer and device support. It operates at 900MHz, so it doesn't interfere with WiFi networks (2.4GHz or 5GHz), although it could interfere with a cordless phone (although who uses those anymore?). Being lower powered, Z-Wave is good for wireless devices (e.g. motion sensors, door sensors, etc.). All Z-Wave devices can talk to all other Z-Wave devices within a network. Additionally, Z-Wave devices act as a mesh network, so the more devices you have in your house, the stronger your network. Z-Wave allows a maximum of 3 hops among devices to transmit to the hub, which is nice. So you could hit a smartswitch at one end of the house, and the signal will "hop" to another switch, to a smart outlet, to your home hub. You'd need a smarthub (e.g. Samsung SmartThings, Wink, Wemo, HomeKit, etc.) to bridge between your wireless network and Z-Wave devices. I've standardized my home on Z-Wave devices.
- Thread - a relatively new wireless, low-power home automation protocol founded by manufacturers such as Google (and Google-owned Nest), Samsung, etc. I don't know much about it, but a quick search shows it's based on Zigbee technology and should mirror Z-wave in terms of functionality
Lighting
There are lots of "smartbulbs" such as Philips Hue or LIFX which change color, dim, and turn on and off with a smartapp. I initially started looking at Philips Hue as an option for lighting, but quickly found that it did not meet my needs. It's expensive, requires its own hub device, I didn't need the color-changing features, and I had read it can be unreliable. I did a bit more research and stumbled on Z-Wave smart dimmers and switches (I use GE Z-Wave smartswitches, but there are other options such as Lutron). I like using smart switches rather than smart bulbs since I can continue to use traditional light fixtures and bulbs and can control the lights via apps or via the switch itself. Also, they fully support three-way and four-way circuits (i.e. when you have two or more switches in your house that control the same light(s)). The downside is you have to be comfortable with home wiring and replacing traditional switches.
Cameras
I did a ton of research into cameras and ended up choosing Ubiquiti UniFi as it suited all of my requirements. Most notably:
- Wired - I didn't want to use wireless cameras for both security reasons and because I didn't want to saturate my wireless network with camera traffic
- Power-over-Ethernet (PoE) - Power-over-Ethernet is exactly what it sounds like - you can both power the camera and receive camera data via one Ethernet cable (e.g. Cat5e or Cat6). This eliminates the need to find a power source or plug nearby, or worse, having to use and recharge batteries
- No cloud storage required - I really did not want to pay a subscription service for cloud storage of my video footage since I have a perfectly good (albeit old and slow) NAS at home. Cameras such as Nest or Ring require a monthly or yearly cloud subscription. With Ubiquiti I can specify how much and for how long to store video footage
- Linux support for the NVR - probably not a big consideration for you, but I really wanted full control of my NVR and the ability to install it on Linux since that's primarily what I use for my home lab and setup. Ubiquiti Video NVR also supports Windows and Mac, so you'd be okay there.
- Advanced motion zones and detection - I can define motion zones to monitor (and not monitor) to trigger recording, as well as specify the duration to record
- Camera NVR - if you don't want to go with Ubiquiti cameras (they are relatively expensive), take a look at Foscam PoE cameras and BlueIris NVR software
My Setup
To give you an overview of my setup, I'm using the following:
- Samsung Smartthings Hub
- Using WebCore for advanced logic and orchestration
- Ubiquiti Unifi Video Cameras (for external cameras and security)
- GE Z-wave smart switches
- GE Z-wave smart outlet
- Ecobee v3 thermostats (I deliberately chose v3 since I didn't want or need the built-in Alexa functionality in v4)
- Ring Pro Doorbell
- Foscam wireless camera (for internal monitoring and recording)
- Ecolink motion sensors (for internal and external motion detection for security and light automation)
- Ecolink rare earth magnet door sensors (for door monitoring for security and light automation)
- Google Home
- Google Chromecast
The best word to describe the first few sessions: frustrating. Having tracked the BRZ nearly a dozen times at AMP I had gotten comfortable with the lines, brake zones, and corner speeds in that car, which I quickly found did not translate directly to the Miata. Not that the BRZ is a power monster, but compared to the Miata, I could miss an apex or deviate from the perfect line and still run 1:43s consistently. Not so with the Miata. One bad corner, a little too much application of the brake, or deviating from the race line at all, and I was hanging in the 1:51s (not great).
But by the second half of the day, once I had gotten a better feel for the car and learned the near limits of the tires and suspension, it became much more fun. And once I accepted the fact that I'd be giving point-bys most of the session, whenever I found myself some open track the car was amazing fun. I could carry a lot more speed through certain corners, I could brake later in others, and I could hold tighter lines than in the BRZ (and certainly than in the understeering STI). Perhaps it was having the top down, not having any electronic nannies, or the direct feedback from the steering, but I certainly see the appeal of the NA Miata on a track like AMP.
At the risk of coming off as condescending, there is a certain pride in hanging with more powerful cars. The occasional time I earned a point-by from a late-model Mustang or E36 BMW was all the more rewarding. The thing about the Miata is, if you don't get your braking just right or hit your apexes perfectly, you're in for a frustrating lap. But get it right, and every second off your previous time is all the more rewarding.
One item of note is that by the end of the day, I was pretty beat up (physically). Having the stock leather seats and original '95 suspension led to a lot of leg and knee banging against the door and transmission tunnel. Some Sparcos, a race harness, and some new coilovers with a bit of life left in them would serve this car well.
Would I take the Miata on a faster track like Road Atlanta or Roebling? Absolutely not, unless I had a penchant for masochism. AMP or Barber? If the weather is alright, and of course, with permission from Lydia.
Ever since high school, when I helped a friend and her father restore a '67 Mustang Convertible, I've had the itch. From the satisfaction of working with one's hands - learning the intricacies of motors, suspension, brakes, and electrics to restore a hunk of metal to its former glory - to appreciating the sweeping and elegant lines of the original pony car, I was hooked. I knew one day I wanted to pick up a project pony car of my own, to restore and rebuild my own way. And while I've come across a few suitable candidates (including an original '65 GT in St. Simons that I came very close to buying), I've mostly put the idea on the back burner to focus on my Subies, saving for a house, and other hobbies. That was until my friend Jonathan sent me a few quick photos of a rather clean '66 his coworker was selling, for an unbeatable price... $1500! Quite literally a great deal - I could have probably flipped the car and made $1000 - $2000 the next day - however, I'd never forgive myself for missing the opportunity to restomod such a suitable candidate:
Relatively rust-free, a one-owner Georgia-only car with the Sprint 200 I6 motor. Pretty much the perfect candidate for a restomod. Since helping restore the '67 in high school, I had struggled with whether my eventual car would be a by-the-book restoration or a restomod; over the years of reflecting, I landed on wanting a visually stock restored car, but with the modern comforts and reliability of a newer car. I knew I wanted a first-generation model ('64 1/2 - '66), as I like the more diminutive body of the original. A cheap I6 would be the ideal candidate for my project, since no matter what, I'll be replacing it with an electronically fuel-injected (EFI) 5.0L motor, a more modern automatic transmission, a new rear end, and disc brakes - and sure enough, this car found me.
While I don't have the towing capabilities, space, tools, or knowledge to perform an engine and drivetrain swap, I'm fortunate enough to have a friend as good as Jonathan, who does. Aside from his mechanical, fabrication, and general automotive expertise, he also has a keen eye, honed skill, and seemingly endless luck in "wheeling-dealing," something I seem to have systematically lacked - that is, until this project!
As with most of my blog posts, this one is coming a little late as I've already begun my project, but I'll play catch-up on my posts over the next few days/weeks and use my blog to document our progress. As a teaser, I've already found my donor car and have begun the process of tearing down the Mustang in preparation for the transplant.
Eat a bag of dicks, Russia.
Last evening, I was alerted by one of my web server tenants (Lydia) that the web server may be down. "Impossible!" I thought to myself. My CentOS web server has been extremely stable and I've taken great pains to secure it (that's not a challenge, by the way; leave me in my happy ignorance). In the past, this has usually meant that our connection has gone flaky or the router needed to be rebooted. But this time, our connection checked out, as I was able to resolve external web sites, just not our hosted sites. Hm.
I attempted to SSH to the CentOS VM from within my network, and no love. Again, hm. I popped open a console to the VM and sure enough it was bogged down, unresponsive. Well, that's not great. My first inclination was that the disk could be full, as web devs aren't particularly careful when uploading JPEGs for their sites. Need a 300x200 JPEG on your WordPress page? Ah yeah, this 3176x1571 image will do. I rebooted the VM and watched it come back up with no issues. I logged in and ran df -h. Only 57% of the disk utilized - so not the disk. Then before I could run anything else, the VM started locking up again. Whiskey Tango Foxtrot?
Another reboot, this time to runlevel 3, and I quickly tail -f /var/log/messages, where I see complaints about httpd eating all my memory; then the VM locks up again. This ain't good. Another reboot, this time to runlevel 1 (shit's getting serious), and I dig into /var/log/httpd/error_log. Sure enough, I see [info] server seems busy, (you may need to increase StartServers, or Min/MaxSpareServers), spawning 16 children, there are 41 idle, and 129 total children. I think, "that's odd, I'm only running four virtual hosts, none of which are heavily trafficked." I shut down port 80 on the external firewall and run init 3 to bring all services up. The server seems stable and I can access sites internally (after a bit of fumbling, since I use the ServerName directive for directing traffic, which works all well and good when DNS resolves and the sites are available externally. Not so much internally without hosts files). The server remains stable - this stinks of a DDoS. But why? Who would waste the effort in DDoSing matt5lot10.com (although I realize it is one of the hottest sites on the web)? A quick peek in /var/log/httpd/access_log gives me my answer. My DDoS attacker is 126.96.36.199. A quick ip-tracker.org lookup shows Moscow, Ruskie. Whiskey Tango Foxtrot, seriously?
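For anyone wanting to do the same kind of access_log triage, the combined log format puts the client IP in the first field, so a quick awk/sort/uniq pipeline surfaces the noisiest client. Shown here against a fabricated sample log rather than the live /var/log/httpd/access_log:

```shell
# Fabricate a tiny sample access_log (on a real box, point the
# pipeline at /var/log/httpd/access_log instead)
cat > /tmp/sample_access_log <<'EOF'
126.96.36.199 - - [10/Mar/2018:03:14:07 -0500] "GET / HTTP/1.1" 200 512
126.96.36.199 - - [10/Mar/2018:03:14:07 -0500] "GET / HTTP/1.1" 200 512
126.96.36.199 - - [10/Mar/2018:03:14:08 -0500] "GET / HTTP/1.1" 200 512
10.0.0.5 - - [10/Mar/2018:03:15:01 -0500] "GET /index.html HTTP/1.1" 200 1024
EOF

# Count requests per client IP, busiest first
awk '{print $1}' /tmp/sample_access_log | sort | uniq -c | sort -rn | head
```

The busiest IP floats to the top of the output, which is exactly how a single-source flood sticks out.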
My first inclination was to just block the one IP on the firewall, but after having to downgrade my lab after moving last year, I no longer have pfSense to do so, and my firewall device leaves some features to be desired... (DD-WRT may be in my future). So it looks like we'll need to rely on good ol' iptables. And f*ck that IP, I'm blocking all of Russia and China - no matt5lot10 for you! A bit of googling as to how to best blacklist entire countries turned up this Matt Wilcox blog post, which uses the handy, dandy ipdeny.com to create an ipset which can be blocked in iptables. I had to modify his scripts a bit (it looks like he uses a Debian-based Linux), and my bash scripts have a bit more error checking to ensure a zone file exists in /etc before attempting to delete it.
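For reference, the general shape of the ipset + iptables recipe looks something like the sketch below. The zone file would normally be downloaded from ipdeny.com (e.g. ru.zone, one CIDR per line); here it's a fabricated two-line stand-in, and RUN=echo makes it a dry run that prints the commands instead of executing them, since the real thing needs root:

```shell
#!/bin/bash
# Sketch: blacklist a whole country via ipset + iptables (dry run).
# ZONE is a fabricated stand-in for an ipdeny.com zone file.
ZONE=/tmp/ru.zone
RUN=echo   # set RUN="" to actually execute (as root)

cat > "$ZONE" <<'EOF'
5.8.0.0/19
31.13.144.0/21
EOF

# The extra error checking mentioned above: bail if the zone file
# is missing or empty rather than blowing away an existing set
[ -s "$ZONE" ] || { echo "zone file $ZONE missing" >&2; exit 1; }

# Create a hash:net ipset and load every CIDR block into it
$RUN ipset create russia hash:net
while read -r net; do
    $RUN ipset add russia "$net"
done < "$ZONE"

# Drop any inbound traffic whose source address matches the set
$RUN iptables -I INPUT -m set --match-set russia src -j DROP
```

An ipset keeps the iptables rule count at one per country instead of one per CIDR block, which matters when a zone file has thousands of entries.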
Sure enough, once the ipsets were in place and iptables deny rules implemented, Apache is again swimming along and the DDoS attacker(s) can eat a bag of dicks.
So why DDoS matt5's IP? Who knows - probably some Vodka-soaked Ruskie had too much free time on his hands and is just pointing H.O.I.C at US IPs. What's more concerning is it could be state-sponsored Vodka-soaked Ruskies carrying out these and other attacks - it probably explains how Trump was elected.
Either way, no more matt5lot10 for China or Russia, they've lost their privilege of matt5lot10.com.
I attended my first SCCA event, "Track Night in America," at Atlanta Motorsports Park on August 24th and had a good time. The event is structured like a track day "lite," with three run groups, each run group having three 20-minute sessions. The track night was fun, and it was nice to be on the track in the evening rather than in the sweltering heat of day.
My only complaints are that the event was a bit disorganized (if not dangerous) due to a lack of coaching/instruction, and one dumbass in a yellow MG "Peanut" apparently chose to run in the intermediate run group despite not having a clue as to how a track day works. Dude completely ignored passing zones and apparently had no comprehension of what a "point-by" is (basically a means of allowing safe passing where the driver "points by" a car in his mirrors).
Despite "Peanut," the track night was a success. Here's some footage of one of the last runs during the evening as the sun was setting
I've since dug up the paper and given it a re-read, reminiscing about my days as an Econ and IT wonk, so in the spirit of Open Source, I've uploaded the paper to matt5lot10.
Here's the abstract from the thesis:
The market for software is in many ways distinct from that of any other good due to software's strong network effects, imperfect public-good nature, and complexity. As a result, the free market has two primary means of delivering software to consumers: the proprietary method, and the free/open source method. These two contrasting methods of software production may appear to be mutually exclusive, and many believe their coexistence to be unsustainable. This study investigates the theoretical and empirical literature regarding the rationale and motivation for, and welfare impacts and overall efficiency of, each software development model and how they interrelate. We can conclude from this study that these two models are in fact not mutually exclusive and in some cases complementary, provided certain market conditions are met. We can also conclude that the open source software development model is in fact consistent with economic theory and is thus a sustainable method of software production.
It was my first track day at RRR, so all the lines were new. I can't say it's my favorite track - it's nearly all right-hand turns on a pretty flat circuit with lots of sand, but the location can't be beat - 20 minutes outside of Savannah, the weather was awesome, and the camping was great.
So after owning the BRZ for 2.5 years with no power upgrades, I finally bit the bullet and installed some goodies to give it a bit more power and help reduce the all-too-familiar torque dip on the FT86. The upgrades performed this past weekend include:
- Moto-East Flex Fuel Kit
- JDL UEL Header
- ISR Overpipe
- EcuTek Tune
All told, the modifications should add about 30whp, bumping it up from 175 to 205.
For your viewing pleasure, here's a timelapse video of the installation. Lots of standing and talking, but that's part of the process
Quick one but useful, as I wasn't able to find this information online or in Samsung's support manuals.
We have a Samsung TV - UN50EH5300FXZA - and a Samsung Home Theater/Blu-ray - HT-H5500W. We've been having issues with volume controls on the Home Theater with audio output from the TV and devices plugged into the TV (cable box, Raspberry Pi, laptops, etc). Initially, I had HDMI output from the HT-H5500W to the TV as well as an AmazonBasics optical cable output from the TV to the input on the HT-H5500W, as audio was not outputting to the home theater. While this worked for the most part, we were having on-and-off issues of the TV not being able to output audio to the home theater, as well as odd volume behavior where the Home Theater would seemingly randomly increase the volume on the rear speakers when increasing or decreasing the volume on the system. I did a bit of research and found the following:
- No optical cable is required from the TV to the home theater because of ARC (Audio Return Channel). See this article for more info. So I disconnected the cable
- Only some HDMI cables support ARC. Fortunately one of my cables is an Amazon Basics cable which does. I made sure to use this cable to connect the Home Theater to the TV
- Most TVs mark one specific HDMI port as supporting ARC. Our TV (UN50EH5300FXZA) does not. After some troubleshooting, I've found that HDMI port 2 worked
- I believe that in order to use ARC, the TV must recognize the Home Theater as an AnyNet+ device
- On the Samsung Home Theater device, under Audio settings, ARC was set to "Auto." I set it to "On"
- It wasn't mine
- My buddy didn't buy HPDE insurance
- I was driving it like I was the STi and not staying on the top end of the rev range
Featuring Adam out front. Music by Brand New.
In a serendipitous procession of events, the TooheyMobile is no longer in the stable, as it had to be surrendered to make room for the new toy above. Nothing was inherently wrong with the Subaru Legacy - in fact I parted with it rather regrettably, but I just couldn't turn down the deal on the STI. I happened to run a search on AutoTrader one Friday for "wanted cars" (as I'm apt to do), when a new listing popped up from the same dealer from which I purchased Zed. Having no photos and a limited description, the ad was austere, but two very important data points were present: mileage and price. Recognizing the potential deal, we hopped in the Legacy and headed over to the dealership to scope out the car in person (mostly just to confirm that the listing was in fact not a mistake). Before I knew it, I was signing paperwork and organizing finances, all before actually having the opportunity to sit in the vehicle, let alone test drive it. So for all of those curious, here are the stats (and essentially why I took the plunge):
- 2004 Subaru Impreza WRX STI
- 36,000 miles
- One owner
- Relatively unmolested
- 20% below KBB value
RedHat has a really useful guide for sharing files/directories between services (e.g. Apache and Samba).
Although not the most secure way of doing things, I've gotten fed up enough times having to scp files and use command-line text editors to update this site that I went ahead and implemented it, allowing me to use graphical editors on other workstations on the network. I've ensured that the pfSense firewall blocks external SMB access and that iptables will only accept connections from a specific set of subnets.
Here's the article. I basically followed it verbatim, although I used the Samba configuration parameters "force user" and "force group" to ensure the correct user was being used, allowing me to write to the actual files.
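For context, the relevant share definition ends up looking something like this sketch - the share name, path, and valid user are placeholders for whatever your setup uses; "force user" and "force group" are the real smb.conf parameters doing the work:

```
[www]
    comment = Web content
    path = /var/www/html
    valid users = matt
    writable = yes
    ; every file written over SMB lands owned as apache:apache,
    ; so httpd and the graphical editor never fight over permissions
    force user = apache
    force group = apache
```

The effect is that no matter which account connects, writes hit the filesystem as the web server's user, which is exactly the Apache/Samba coexistence problem the RedHat guide addresses.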
My new firm uses Google Apps, along with a myriad of web-based SaaS providers, as their office suite, which is AWESOME! No more Outlook, no more Lync, no more Windoze. This is great news for a Linux geek. When I'm working from home, I often use my Fedora-based workstation (which also happens to be my NAS).
Times were good; I was cruising along until this morning when I had to jump on a WebEx for a training call. I was able to dial in using Google Talk, but when I went to join the WebEx itself using Chrome, I was served up a fat "webex.exe." Lame. And I wasn't about to install Wine or use my KVM Windows VM. After some quick googling, I found the Cisco WebEx support page, which clearly states that only Firefox is supported on Linux, and requires Java... Okay, so I flipped over to Firefox and checked to make sure my Java plugin was configured and installed... but it wasn't. Balls. Okay, so I had to symlink the Java JRE library for Firefox... but I couldn't find libnpjp2.so anywhere, even though the JRE was installed. Okay, so I pulled down the latest Oracle Java JRE, installed it, and symlinked to the libnpjp2.so library:
ln -s /usr/java/latest/lib/amd64/libnpjp2.so /usr/lib64/mozilla/plugins/libjavaplugin.so
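One gotcha with this kind of symlink is accidentally pointing at the amd64 directory instead of the .so itself, which Firefox silently ignores. readlink -e is a quick sanity check that the link resolves to a real file; here's a throwaway sketch using fabricated temp paths rather than the real /usr/java and /usr/lib64/mozilla locations:

```shell
# Fabricated stand-ins; swap in /usr/java/latest/lib/amd64 and
# /usr/lib64/mozilla/plugins on a real system
jre="$(mktemp -d)/lib/amd64"
plugins="$(mktemp -d)"
mkdir -p "$jre"
touch "$jre/libnpjp2.so"

# Link the actual .so file, not its parent directory
ln -s "$jre/libnpjp2.so" "$plugins/libjavaplugin.so"

# readlink -e prints the resolved target on success and
# prints nothing (nonzero exit) if the link is dangling
readlink -e "$plugins/libjavaplugin.so"
```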
Then I went ahead and relaunched the WebEx, and SUCCESS! Well, sort of. I could get the Java WebEx application open, but I didn't see any dial-in prompts and I couldn't see the shared screen session. It seems once again I've opened the Pandora's box that is Java. Ick. To spare you the gritty details, I found that I was missing a number of packages (which, for the record, are mostly antiquated or not listed on the WebEx system requirements page), and I found this article which lists the necessary packages on Fedora 19 (and consequently FC20).
# yum install icedtea-web pangox-compat.i686 libXmu.i686 java-1.8.0-openjdk libgcj.i686 mesa-libEGL.i686
# setsebool -P unconfined_mozilla_plugin_transition=off mmap_low_allowed=on
For the record, I'm using the IcedTeaPlugin (/usr/lib64/IcedTeaPlugin.so) rather than the true-blue Oracle Java JRE plugin. Hope this helps!
I've got a dual-monitor setup with two Hanns-G HI221 monitors, both using my motherboard's integrated Radeon 3000 GPU; one connected to VGA and one to DVI. After upgrading to Fedora 20, the monitor connected to the DVI port would blink on and off intermittently (and very annoyingly). After hours of troubleshooting with xrandr (this reference was invaluable), I found that the auto-detected refresh rate was too high for the monitor. With some guessing and checking, I found a refresh rate of 54Hz to be the sweet spot. Here's how I set the custom resolution:
- sudo xrandr --newmode "1680x1050_54.00" 130.25 1680 1776 1952 2224 1050 1053 1059 1086 -hsync +vsync
- sudo xrandr --addmode DVI-0 "1680x1050_54.00"
- sudo xrandr --output DVI-0 --mode "1680x1050_54.00"
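One caveat: xrandr settings don't survive a reboot or X restart. A sketch of one way to persist the mode (assuming an Xorg setup where the output keeps the DVI-0 name) is an xorg.conf.d monitor section reusing the same modeline:

```
Section "Monitor"
    Identifier "DVI-0"
    # Same modeline as the xrandr --newmode command above
    Modeline "1680x1050_54.00" 130.25 1680 1776 1952 2224 1050 1053 1059 1086 -hsync +vsync
    Option "PreferredMode" "1680x1050_54.00"
EndSection
```

Dropped into something like /etc/X11/xorg.conf.d/10-monitor.conf, this should make X pick the 54Hz mode at startup instead of the too-high auto-detected rate.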
- ESXi Host
- HP ProLiant MicroServer Gen8 Ultra Micro Tower Server System Intel Pentium G2020T 2.5GHz 2C/2T [Link]
- Kingston 16GB (2 x 8GB) 240-Pin DDR3 SDRAM ECC Unbuffered DDR3 1333 Server Memory [Link]
- SAMSUNG 840 EVO MZ-7TE120BW 2.5" 120GB SATA III SSD [Link]
- Gigabyte GA-MA78GM-S2H w/ AMD Athlon 64 X2 2.7GHz and 4GB RAM
- 3 x 3.5" 1TB SATA II Drives
- 1 x 3.5" 500GB SATA II Drive
- 1 x 2.5" 250GB SATA II Drive
- Managed Switch
- Intellinet 24-Port Gigabit Ethernet Rackmount Managed Switch [Link]
- Fedora Apache server restored to service (PSU replaced). The bad PSU resulted in 5 days of downtime for Matt5lot10. So much for my goal of 99...
- Fedora updated to FC19; Apache, PHP, and MySQL installed to host this site
- NAS built with BTRFS (1TB and 500GB drive) with SAMBA4
- Music and movies transferred from HTPC to NAS. Initially rsync'd via Samba shares, but in the interest of speed (and to learn), I moved the disks into the NAS using vgexport and vgimport.
- HTPC hard-drive swap and Ubuntu 12.04 LTS reinstall (resolution of HDMI/display issue)
- LVM disks repurposed for BTRFS NAS drives
- Mattbook Pro updated to OSX Mavericks
- Full NAS build-out once this baby arrives
- Configure iSCSI
- Mini-rack install
- ESXi build
- vCenter Build
- Switch configuration and VLAN creation
- Matt5lot10.com moved to Highly-available VM
- AD, DNS, LDAP, and DHCP service standup
- VDI deployment
- Matt5lot10 updates - particularly the "Where's 5" feature, assuming Google+ API team gets its act together
I'm back! It's been a while since my last post (technically I posted a few weeks ago, but that got erased when I upgraded my server), but I've settled back down in my homeland after traveling for nearly two months around Australia and the US.
I'm happy to announce that both I (Matt5) and my website (Matt5lot10) have relocated to greener pastures. Well perhaps not I (Melbourne was a pretty green pasture), but Matt5lot10 has been migrated from my previous hosting service (InMotionHosting.com) to my personal server (matt5lot10.com). No hard feelings against InMotion, I actually really liked them and their services, but now that I'm back Stateside with my Linux server, I figured it wasn't worth the $130/year to host this site.
Matt5lot10 is now sitting safely on a (not-so) highly available Linux server in my apartment. It's actually just a staging area until I finish my home lab build-out and virtualize the whole thing.
In related news, I've begun co-authoring a new blog, IceCreamInTheDungeon. The other contributor, Jonesy, has thus far contributed far more to the blog, but I'm hoping to add more posts soon as I begin my home lab build-out. The blog isn't really meant to attract too much attention; it's really just kind of a handy guide to reference.
I'm hoping to ramp up activity on this site and that blog in the coming weeks, so check back soon for some fascinating updates from Matt5Lot10.com.
So development efforts have been placed on the backburner these past couple of months thanks in large part to work. Excuses, excuses, I know, but over the past three months I've had my sister and her boyfriend visit, traveled to Tasmania, India, and Fiji, and my client project has been racing towards service transition up until last week. Just for good measure, I'm also preparing to move back home to the States in a month, with a couple of months of more travel around Australia and the US in between.
That being said, I am taking a "mini-retirement" over the next two months, which could mean some free time to poke around on some personal projects, be they web-dev related or otherwise. First and foremost, it's high time I built my own virtual lab at home to give me a platform on which to build infrastructure, host web servers, and keep abreast of the latest developments in the field of IT. Case in point: lacking a suitable home lab, I haven't yet had the opportunity to actually use Windows 8. It's the first Windows OS I haven't at least poked around with since Windows 3.1.
But I digress. These next two months are going to be fun ones, and hopefully once my feet are firmly settled on US soil, I'll be able to jump back into personal projects that I can share on Matt5lot10.com.
Since purchasing my DSLR camera last year, I've been having a play with some different photography techniques, including long-exposure, light-painting, motion photography, and HDR composition. After an educational crash course in photography with a friend, Dennis Advani, who in a former life was a scientific photographer, I've been honing my skills and trying new techniques.
After impatiently waiting for new kit that I purchased for the camera, which included an adjustable neutral density filter and a digital shutter remote, I stumbled upon the Canon EOS Utility, which accomplishes the same functionality as the digital shutter remote. Not wanting to leave my camera and laptop somewhere out in the wild, I decided to give it a try from the safe confines of our apartment balcony. A great idea for someone who lives on the top floor of a skyscraper, or perhaps on a mountainside, but not so much from our humble perch in Tribeca. Not to be discouraged, I positioned the camera to catch what little night sky we can see and let the camera fire away for four hours. Below is the result:
After four hours of 13-second exposures every 3 minutes, the camera battery died, leaving me with roughly 80 shots to be stitched into stop-motion footage. I first attempted to compose the clip in iMovie, but thanks to Apple's cut-rate bundled software I couldn't get it right; namely, when importing photos you can't reduce the clip duration below 0.1 seconds. I ended up relying on Adobe Premiere to achieve the final result. The clip is a bit rough due to the limited number of frames to work with, but it was a good learning experience with timelapse, and I can take a more honed approach next time. I'm hoping once my new toys arrive, we can go on a camping trip out in the bush and, with a little luck with the weather, capture a more picturesque shot.
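For the curious, the shot count and resulting clip length work out as follows (the 24 fps playback rate is an assumption for illustration; the actual timeline settings in Premiere may have differed):

```python
# Timelapse math: one 13-second exposure every 3 minutes, for 4 hours.
interval_min = 3
duration_hr = 4

# Shots captured before the battery died.
frames = duration_hr * 60 // interval_min
print(frames)  # 80

# At an assumed 24 fps playback rate, 80 frames makes for a very short clip.
clip_seconds = frames / 24
print(round(clip_seconds, 1))  # 3.3
```

Which is why the clip feels so brief: a whole evening of shooting compresses into just a few seconds of footage.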
Check out Dennis Advani's Android App PhotoSuitecase, which aggregates your photos from online providers for fast offline access!
So after slacking off for a number of months, I figured I'd at least seal up some loose ends (if only temporarily) and point the remaining top bar links to 3rd party tools. Most notably, 5Brew currently links to my Hopville profile and 5Pics links to my public Picasa album. Excitingly, I did finish the Hi5 page. So at least for now, there are no dead links and the site is somewhat tied together.
Fear not though, I still have some plans in the works for the site, mostly around pictures and videos. With the purchase of an EyeFi card in the near future, I'm hoping to set up an in-house gallery that automatically posts photos taken from my DSLR via FTP. I have a pretty clear idea as to how to accomplish it technically; I'm just thinking through what I want to do about automatically uploaded pictures that I don't want on the interwebs. And no, it's not what you're thinking- not because I'm afraid of uploading obscene photos, but because I often take a lot of pictures of the same subject, just using different exposures. I typically end up deleting half of them, so if they're all automatically uploaded, it'll be a lot of noise to sort through to see the good stuff.
Additionally, I'm trying to think of what additional content I want up on the site; as of right now, it's a bit sparse. Okay, so it's really sparse. But I want to navigate the waters carefully, and not fall to the sinister temptation that plagues so many of my generation today and make this site gaggingly narcissistic. It's a fine line between simply having one's own personal web site and being clinically diagnosed as a megalomaniac (fine, not that close, but it is an issue). Furthermore, I'd consider introducing mechanisms for allowing visitors to post content, but what's the point, now that everyone and their dog has a Google+, Facebook, Twitter, and LinkedIn account and can upload their latest turd (metaphorically speaking of course... well, except for sites like RateMyPoo.com, in which case it is quite literal) for the world to see. Plus, who would post anything up here anyway? I have no illusions that Matt5lot10.com is going to become your next homepage (in fact, if it did, that'd be a little creepy), so I don't think visitor- or user-provided content is the way to go. I guess for now, humble Matt5lot10 will remain humble, where I can occasionally rant or have a play with some new API, open source tool, or code that I've wanted to try out.
So I guess the next steps for now are ambiguous. I'll probably include a links page at some point, and maybe dress up some of the existing pages. I think the next task will be to create a simple back-end that lets me write posts using a rich text editor, rather than punching them in via phpMyAdmin. Heck, maybe I'll create an Android app. Why not?
The site is coming together, albeit a bit slowly. Since the last post, the blog page was added (which features these and eventually other posts), as well as the Employ5 page. In the next few days, I'm hoping to polish up the Employ5 page a bit more, and tackle the Hi5 page.
Honestly, I spent more time working with Banshee on my Ubuntu home theater PC to ensure it was correctly scrobbling than I did actually writing code. As it turns out, an older version of Banshee's Last.fm plugin reported timestamps incorrectly (using negative values) and didn't properly escape XML special characters (i.e. &, ", etc.) in its entries.
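To illustrate the kind of escaping the plugin was missing, here's a quick sketch in Python (the plugin itself is written in C#, and the track title below is made up):

```python
from xml.sax.saxutils import escape

# A track title containing XML special characters would break a naive submission.
title = 'Rock & Roll "Classics" <Live>'

# escape() handles & < > by default; an extra entity mapping covers double quotes.
safe = escape(title, {'"': "&quot;"})
print(safe)  # Rock &amp; Roll &quot;Classics&quot; &lt;Live&gt;
```

Skip that step, and a single ampersand in a song title is enough to produce malformed XML and a rejected scrobble.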
That's it for now! Happy 2013!
Although not moving at a lightning pace, if you've visited this site before, you may have noticed that new features have been introduced, including the photography slideshow, the integrated location awareness (using the Google Latitude API), and the slick new Matt5lot10 logo.
Although no pages (other than the main page) have been created as of yet, they are forthcoming. Currently I'm still trying to get a feel for how I want the site to look and feel, and whether I should leverage "cloud" services (e.g. Picasa, Hopville, etc.), implement "canned" code, or develop custom solutions. For the moment, I'm going to continue forging ahead with custom development, to dust off my old web dev skills and maybe learn something new in the process.
The Google Latitude integration was a pretty cool project, although it ended up being much easier (and taking far less time) than anticipated. It's amazing how easy JSON is to use with PHP. I was purely an XML guy after implementing some AJAX stuff on previous projects, but I've since revised my opinion to be more accepting of JSON.
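To give a flavor of why (sketched here in Python rather than the PHP I actually used, and with a made-up payload shape; the real Latitude response fields may differ):

```python
import json

# A Latitude-style location payload (hypothetical field names, for illustration).
payload = '{"data": {"kind": "latitude#location", "latitude": 40.7163, "longitude": -74.0086}}'

# One call turns the whole response into plain dicts and floats, no DOM walking required.
loc = json.loads(payload)["data"]
print(loc["latitude"], loc["longitude"])  # 40.7163 -74.0086
```

Compare that to parsing the equivalent XML, where you'd be traversing nodes and casting text values by hand.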
CSS still continues to be a little bitch, but I'm forging ahead and trying to follow proper form. Positioning is probably the most frustrating part, and ensuring that the PHP/back-end code follows good form while keeping the CSS simple has also proven challenging.
Keep an eye out for more updates. I think the first page I'll tackle is the blog page.