I gave up drinking 3 months ago, but I still have a fridge full of beer which I didn't want to pour down the sink, so I decided to make something with it. I found My Fridge Food, which is an awesome little site that will take the contents of your fridge and cupboards and suggest recipes! A lot of the recipes were rubbish, though (poached eggs, anyone?).
I found this honey beer bread recipe from Michelle’s awesome Brown Eyed Baker blog, and armed with a bottle of Desperado totally destroyed my kitchen.
Like, completely destroyed it. Flour, everywhere. Mostly on me. But an hour later, I took out of the oven some delicious beer and honey crack.
Back in September Google released its new Hummingbird algorithm, apparently affecting 90% of search queries, as it strives for greater accuracy and more relevant results. It's aimed at 'conversational searches', like what is the best cake? rather than simple terms such as best cake. (When was the last time you made a search request like that?)
Then on the 4th October they released Penguin 2.1, aimed at reducing the rankings of sites with spammy backlinks. I assume the theory is that better sites have better backlinks (as they have been around longer?) and newer sites (trying to use spammy backlinks to gain traction?) can't be as good or as relevant. Fairly naive thinking, though I guess it might help in 'cleaning up the web'?
Quality Of Results
If Google's main aim is to provide better quality results, then let's look at an honest search query that I've just made: nginx sip proxy! At work we need to find out if we can proxy SIP through Nginx using WebSockets, so this is the kind of query I'd use to start my information search!
Google Results
The first result from Google is from the Nginx documentation, but that page doesn't have the word SIP on it anywhere, so how can it be useful to me? Sure, it's very authoritative (although actually out of date) about Nginx and HTTP proxying, but it's the least helpful result ever.
Bing Results
The first result from Bing is more relevant: a topic on the Nginx forums asking about exactly what I'm after, using Nginx as a SIP proxy. The results after that are also a lot more relevant, other people discussing using Nginx as a SIP proxy!
Google Are Making The Internet Worse
In all likelihood, Google will start ranking this page for nginx sip proxy by the end of the day; if they do, I will tack everything I learn on to it to make it actually helpful! Google seem to be on a quest to create a really high barrier to entry for new web developers, preferring to return old, out-of-date information, whilst at the same time dumbing down their search results and losing sight of what made them great in the first place:
There are several things to think about when taking control of another computer: the operating system running on it, the speed of your network connection, and the tools you have at your disposal.
How To Take Control Of Another Computer
Operating System
There are 3 main choices of operating system that the computer you want to take control of might be running: Windows, Mac OS X and Linux. Fortunately Mac OS X is based on BSD, so the tools you would use to take control of it are the same as you would use for Linux, simplifying things somewhat!
Network Speed
If you have a fast network it's possible to use remote desktop tools such as Windows Remote Desktop or Virtual Network Computing (VNC), or on a Mac or Linux server you can do X over SSH.
How To Take Control Of Another Computer
Windows
To enable Windows Remote Desktop, click the Start button, click All Programs or Programs, and then click Accessories.
Mac / Linux
How To Tunnel X Over SSH
If you want to know how to take control of another computer that has X Windows on it, e.g. a Mac or a Linux machine, or even a really old Solaris box or other flavour of Unix, then you need to make sure the machine you want to take control of has the following in its sshd_config file:
X11Forwarding yes
You can then connect to the remote machine with
ssh -X hostname
Then any GUI applications you run via the command line will magically appear locally on your machine.
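Putting that together, a quick sketch (the user, hostname and app are placeholders; xclock is just a handy test app if it's installed):

```shell
# on the remote machine, confirm sshd allows X11 forwarding
grep -i x11forwarding /etc/ssh/sshd_config

# connect with X forwarding enabled (try -Y for "trusted" forwarding
# if -X gives you authorisation errors)
ssh -X user@hostname

# in the remote shell, any GUI app now displays locally, e.g.
xclock &
```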
We're using an old version of Upstart, on CentOS, to manage stopping and starting our Node.js daemons, and one of the things the script does, like any good daemon, is change the user of the daemon process from root to something more applicable. Security and all that 😉
Which is nice, as it means we can use Upstart to stop/start/status daemons really nicely. The equivalent init.d script looked really horrible.
But there’s one massive caveat, which we always encounter when building a brand new box, from scratch.
[2013-09-27T10:50:10.174Z] (upstart) Starting amazing-daemon
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have a tty to run sudo
So it all falls apart due to the following error:
sudo: sorry, you must have a tty to run sudo
Basically sudo is stopping the process from running because Upstart doesn't have a TTY. This is easily fixable: just edit /etc/sudoers using visudo and comment out
Defaults requiretty
i.e.
#Defaults requiretty
Now we can use Upstart to start the daemon and check its status to confirm it's running! More recent versions of Upstart don't need this hack. One day I'll live in the future, but not today.
deploy:amazing root$ start amazing
amazing start/running, process 3965
deploy:amazing root$ status amazing
amazing start/running, process 3965
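For context, a stripped-down sketch of the kind of job file involved (paths and names here are hypothetical, not our real config); the exec line with sudo is what trips over requiretty:

```
# /etc/init/amazing.conf -- hypothetical example job
description "amazing-daemon"

start on runlevel [2345]
stop on runlevel [016]
respawn

# old Upstart (0.6.x on CentOS) has no setuid stanza, so we drop
# privileges with sudo -- which is what requiretty blocks
exec sudo -u amazing-user /usr/bin/node /opt/amazing/app.js

# newer Upstart (1.4+) can do this natively instead:
#   setuid amazing-user
```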
I've got a rather large dataset that I need to do a lot of processing on, over several iterations. It's a 20GB gzipped file of flat text, and I'm impatient and don't like not knowing things!
My new favourite Linux command line tool, pv (pipe viewer) is totally awesome. Check this out:
But at appropriate moments I've piped the output into the pv pipe viewer tool to report some metrics. FYI the -N flag lets me set a name for the pv instance, and the -c flag enables cursor positioning so we can use multiple instances of pv!
The reason pipe viewer is totally cool is the extra sneaky data we get!
Pipe Viewer Is Magic
Because the first instance of pv is reading our urls.gz file itself, it can display how much of the file it's processed and roughly how long until it completes. MOST USEFUL THING EVER! Also, I had no idea how large the uncompressed dataset was, and was hesitant to extract it as I wasn't sure how big it would be. We can see from the pv instance named zcat that zcat has so far spat out 93.4GB of data; at 67% through, we can predict this file will probably be around 140GB if we extract it. How cool is that? We can also tell from the pv named perl that after splitting and removing the data we don't want, we've so far shaved off 10GB, which is kinda interesting to splurge over for a bit. And lastly, with the pv instance named gzip, pipe viewer is telling us the size of the output file we've generated so far.
This is totally rad.
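For reference, the shape of the pipeline I'm describing looks roughly like this (the perl filter is a stand-in for whatever splitting and filtering you're doing; the pv flags are as described above):

```shell
pv -cN source urls.gz \
  | zcat \
  | pv -cN zcat \
  | perl -F'\|' -lane 'print "$F[1]\n$F[2]" unless /ac\.uk$/' \
  | pv -cN perl \
  | gzip \
  | pv -cN gzip \
  > hosts.gz
```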
Note. Many thanks to Norway for forcing me to rewrite my initial one liner of
zcat urls.gz | sed 's/|/ /g' | while read a b c d ; do echo $b ; echo $c ; done | grep -v ac.uk$ | gzip > hosts.gz
One of our applications (Freeswitch) just randomly crashed for no apparent reason, and didn't write anything to its log files. The service we're trialling is currently in Beta so there's room to muck about and do some diagnostics. I want to make the kernel dump a core file whenever Freeswitch dies, in case it happens again, so that we have some stuff to work with after the fact. It'll also shut up my QA manager.
Check The Current Linux Core Dump Limits
ulimit is used to specify the maximum size of generated core dumps. This is to stop apps chewing up a million GB of RAM and then blowing your disk up when they crash. By default it's 0, which means nothing gets written to disk and no dump is created!
hstaging:~ # ulimit -c
0
Change The Linux Core Dump Limits To Something Awesome
To set the size limit of the linux core files to 75000 bytes, you can do something like this
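Something like this (note that bash actually counts the -c value in 1024-byte blocks rather than bytes, so 75000 works out to roughly 75MB):

```shell
# set the soft limit on core file size for this shell and its children
# (bash counts this in 1024-byte blocks, so 75000 is ~75MB, not bytes)
ulimit -c 75000

# check it took
ulimit -c
```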
Enable Linux Core Dump For Application Crashes And Segfaults And Things
Ok, so we want this to persist across reboots, which basically means we have to stick the ulimit command in /etc/profile. I'm putting this at the bottom of mine:
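Roughly what goes where (the exact sysctl values are the ones explained below):

```
## bottom of /etc/profile -- no cap on core file size
ulimit -c unlimited

## /etc/sysctl.conf
kernel.core_uses_pid = 1
fs.suid_dumpable = 2
kernel.core_pattern = /tmp/core-%e-%s-%u-%g-%p-%t
```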
This basically says: when an application crashes, create a core dump file in /tmp with a useful name pattern.
kernel.core_uses_pid = 1 - add the pid of the crashed app to the filename.
fs.suid_dumpable = 2 - enable linux core dumps for setuid processes.
kernel.core_pattern = /tmp/core-%e-%s-%u-%g-%p-%t - crazy naming pattern for a successful core dump, here's roughly what all the bits mean:
%e - executable filename
%s - number of signal causing dump
%u - real UID of dumped process
%g - real GID of dumped process
%p - PID of dumped process
%t - time of dump (seconds since 0:00h, 1 Jan 1970)
Super useful. Then run sysctl -p so it takes effect, yo!
Now here's the last part. When you want an application to core dump, you create an environment variable before you start it, telling the kernel to sort itself out and get ready to dump. If you want all apps on the server to generate core dumps then you're going to want to specify this variable somewhere near the top of the process chain. The best place for this on a Red Hat style box is /etc/sysconfig/init, so stick the following in that file:
DAEMON_COREFILE_LIMIT='unlimited'
Now might be a good time to reboot, to force it to be set across all applications and things.
Enabling Linux Core Dumps For A Specific Application
This is the slightly less rebooty version of the above. Rather than force the environment variable to be loaded when the box starts, we just stick it in the init script for the daemon, and then restart the daemon.
In /etc/init.d/functions the Red Hat guys have already stuck in support for this variable.
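From memory it's something like this, with the limit defaulting to 0 when the variable isn't set (treat the exact line as an approximation of the Red Hat code rather than a verbatim quote):

```shell
# excerpt from the daemon() function in /etc/init.d/functions (RHEL/CentOS)
corelimit="ulimit -S -c ${DAEMON_COREFILE_LIMIT:-0}"
```

So setting DAEMON_COREFILE_LIMIT='unlimited' near the top of the daemon's init script and restarting it enables dumps for just that service.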
Whilst working on an AMAZING NPM repository mirror yesterday (which totally works, despite not really offering the performance benefit I'd hoped for, because NPM is rubbish) I came across this error whilst doing things:
16 http GET https://localhost:5984/registry/_design/app/_rewrite/-/all/since?stale=update_after&startkey=1371737164294
17 http 500 https://localhost:5984/registry/_design/app/_rewrite/-/all/since?stale=update_after&startkey=1371737164294
18 error Error: insecure_rewrite_rule too many ../.. segments: registry/_design/app/_rewrite/-/all/since
18 error at RegClient. (/root/.nvm/v0.8.15/lib/node_modules/npm/node_modules/npm-registry-client/lib/request.js:259:14)
18 error at Request.init.self.callback (/root/.nvm/v0.8.15/lib/node_modules/npm/node_modules/request/main.js:120:22)
18 error at Request.EventEmitter.emit (events.js:99:17)
18 error at Request. (/root/.nvm/v0.8.15/lib/node_modules/npm/node_modules/request/main.js:648:16)
18 error at Request.EventEmitter.emit (events.js:126:20)
18 error at IncomingMessage.Request.start.self.req.self.httpModule.request.buffer (/root/.nvm/v0.8.15/lib/node_modules/npm/node_modules/request/main.js:610:14)
18 error at IncomingMessage.EventEmitter.emit (events.js:126:20)
18 error at IncomingMessage._emitEnd (http.js:366:10)
18 error at HTTPParser.parserOnMessageComplete [as onMessageComplete] (http.js:149:23)
18 error at Socket.socketOnData [as ondata] (http.js:1367:20)
19 error If you need help, you may report this log at:
19 error
19 error or email it to:
19 error
Visiting that URL in a web browser gave me
{"error":"insecure_rewrite_rule","reason":"too many ../.. segments"}
This is because secure rewrites are enabled! Looking at my CouchDB config, this occurred in the default.ini:
secure_rewrites = true
So in the [httpd] section of the local.ini file I set it to false. In your face, security model!
secure_rewrites = false
Then I restarted CouchDB, and the world was put to rights and the error went away.
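As an aside, on CouchDB 1.x you can apparently flip this at runtime through the config API instead of editing local.ini, no restart needed (hedged; this is the API as I remember it from the CouchDB docs):

```shell
curl -X PUT http://localhost:5984/_config/httpd/secure_rewrites -d '"false"'
```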
Pat Flynn over at Smart Passive Income has just announced the launch of his Niche Site Duel 2 project, and as I kind of called him out a few months ago in my first (stalled) Income Report, causing a WordPress 'ping back' and his mate Blake to pop in and say hi 😀 Rather than be a massive cynic, I thought I'd give them another dofollow backlink and join in with his new project!
But Why??
Ok, enough jokes. I'm sure Pat's lovely and I've actually got mad respect for the results he got with the Insanity workout. (OMG so much link juice…) I'm doing this because I've tried, and failed, to monetise idimmu.net. I don't really mind; I enjoy writing about burgers and logging the dumb stuff I do at work so I don't forget about it. This site was never a project to make money. It was started, quite literally, to be an online memory replacement service due to my inadequate brain, and more often than not, I do actually Google myself in order to remember how to fix iptables or set up ldap!
So, sure, I like the idea of some passive income, and I like to do things with other people, so rather than strike out on my own I'm going to join in with Pat's challenge! Although I'm gutted I'm late to the party so can't be part of his Mastermind Learning Group (TM).
WTF Are You Talking About?
Ok, so here's the thing, the name of the game is this:
Pick a keyword
Create a niche site around it
..
Profit
It reminds me of the underpants gnomes.
Pat's mentioned a selection process here for his keyword, which I've followed!
So, What Is My Keyword?
I’m not telling you that, I’m also not sending it to Pat 😀
Pat’s considering using best minivan as his keyword, which is currently ranking this in the #1 spot for me!
However I will tell you this, I’ve actually picked 2 keywords, because I’m actually going to make 2 sites at the same time! Both keywords are in different niches and are totally awesome. I’ve registered 2 domain names and I’ve created a BlueHost account to host them!
As proof of work I'm going to mention some hashes in a random order: 2 of them are for the 2 domains I've bought and 2 of them are for my keywords! Random, different salts have been used for each string, just in case someone wants to hire the entirety of Amazon's cloud service to brute force them!
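Generating that kind of salted hash is a one-liner; here's a sketch (the salt and keyword below are made-up examples, not my real ones!):

```shell
# salted SHA-256 of a keyword -- salt and keyword are dummy examples
salt='examplesalt'
keyword='best minivan'
printf '%s:%s' "$salt" "$keyword" | sha256sum | awk '{print $1}'
```

Keep the salts somewhere safe so you can reveal them later and prove the hashes match.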
We use Node.js a LOT, which means we do npm install a LOT. And npm is pretty terrible, with horrible dependency handling, so we can end up requesting hundreds of dependent modules with its recursive pattern, e.g. for just one of our projects we can end up with paths like
There are 59 instances of the mocha module in the dependency chain; how is that for terrible reuse of code! Why can't npm be nice like every other language out there, e.g. Perl (hi CPAN), PHP, Ruby (hi gems!) and Python?
npm does cache locally, but it kind of sucks.
Anyway, rant over, we want to create a mirror of the npm repository to mitigate periods of npm outages (occasionally it does have them) and hopefully speed things up a little bit, so here’s how I did it!
CouchDB
All the NPM data is stored in CouchDB. I'm doing this on CentOS, so I'm going to use yum to install CouchDB.
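The install command itself (assuming the EPEL repository is enabled, since CouchDB isn't in the base CentOS repos):

```shell
yum install couchdb
```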
Total download size: 1.1 M
Installed size: 3.0 M
Is this ok [y/N]: y
Downloading Packages:
couchdb-1.2.1-1.x86_64.rpm | 1.1 MB 00:00
Running rpm_check_debug
Running Transaction Test
Transaction Test Succeeded
Running Transaction
Installing : couchdb-1.2.1-1.x86_64 1/1
Verifying : couchdb-1.2.1-1.x86_64 1/1
Installed:
couchdb.x86_64 0:1.2.1-1
Complete!
Simples! The next step is to start it, confirm it's listening on a port, and test that it works!
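Something like this (the service name is assumed from the RPM; CouchDB listens on 5984 by default):

```shell
service couchdb start
chkconfig couchdb on    # start on boot too
curl http://localhost:5984/
# should return a welcome blob along the lines of
# {"couchdb":"Welcome","version":"1.2.1"}
```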
Now we need to tell CouchDB that it needs to replicate from the NPM master in a continuous fashion, so that as the NPM master updates, so does our CouchDB instance!
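That's a POST to the _replicate endpoint; the source URL below is the public registry couch as it was at the time, so treat it as an assumption:

```shell
curl -X POST http://localhost:5984/_replicate \
  -H 'Content-Type: application/json' \
  -d '{"source": "https://isaacs.iriscouch.com/registry/",
       "target": "registry",
       "continuous": true,
       "create_target": true}'
```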
And we're off! You can interrogate how the replication is doing by visiting the server with a web browser at https://hostname:5984/_utils/ and it should look a little like this:
Eventually it will stop growing, I promise 😉 As of writing it’s just shy of 50GB
Configuring NPM To Use Your Mirror
First we need to install some random npmjs stuff into our couch database.
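From memory, the design documents come from the npmjs.org couchapp; the repo location and exact steps below are assumptions based on how it worked at the time:

```shell
git clone git://github.com/isaacs/npmjs.org.git
cd npmjs.org
npm install -g couchapp
couchapp push registry/app.js http://localhost:5984/registry

# then point npm at the mirror:
npm config set registry http://localhost:5984/registry/_design/app/_rewrite
```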
Whether you are looking to open your first e-commerce store, or to get a better deal on your existing one, the e-commerce arena is a minefield that needs to be navigated carefully. Businesses usually end up paying much more for hosted e-commerce solutions than they originally expected, because most e-commerce platforms are not completely upfront about their pricing model. Problems range from fixed long-term contracts, hidden transaction fees and tiered pricing, to frequent network issues and unpleasant customer support. If you accidentally choose the wrong shopping cart it can end up being an expensive mistake, so you need to take the time to make sure you make the right choice.
Volusion is one of the most popular SaaS (software as a service) e-commerce solutions that quickly lets you set up a store and sell your products online. Along with ready-to-go templates, a website, store and hosting in one, and very low prices, Volusion offers the tools you need to customize your store and grow your business. They also offer a completely free trial so you can try before you purchase.
Volusion is a complete e-commerce shopping cart platform that offers hosting, an online store and a website with many different themes and options, no setup costs, no hidden transaction fees (very important!) and one of the lowest monthly prices in the industry.
I’m giving Volusion top marks as my experience with them has been phenomenal: 9/10.
Volusion Features
Volusion is a complete e-commerce solution, including a shopping cart and website, that helps you quickly set up your online store to sell your products, with the ability to customize your store, choose from over 120 free templates, organize your products, accept credit card payments and track orders.
Adding items to the shopping cart keeps customers on the product page, reducing abandoned carts
The deal of the day feature will help promote your products and increase sales
Built-in social sharing to 25 different networks will help customers spread your products virally
Create coupons and discounts to entice new users and bring back old ones
Customers can review products for real-time feedback
Unlimited product options and combinations
Volusion’s SmartMatch technology keeps track of your stock status, tracking unlimited combinations of product options
The mobile-optimised website helps you reach, and sell to, more customers
The built in Customer Relationship Management tool helps support your customers easily and efficiently
Process orders quickly to view and approve orders in a moment and get real-time performance data on your business
Volusion provide free 24/7 customer support
Also sell items on eBay, Twitter and Facebook
The product comparison tool lets you show customers multiple product details side-by-side
Built in emails and newsletters
Over 120 free, great looking templates
Showcase products with vZoom so shoppers can zoom into product images
Volusion Review
Monthly Fees
Volusion has some of the lowest monthly fees among all popular e-commerce shopping carts. Starting from $15 a month, the Mini Plan includes up to 100 products, a Facebook store, 1GB of data transfer, social tools and a mobile store. Subscribing to the slightly more expensive plans, such as Bronze at $35 a month, gives you more features, increasing the product range to 1,000 and doubling the data transfer to 2GB, along with adding Abandoned Cart Reports, ratings and reviews from customers, and newsletters. The $65 a month Silver plan gives you 2,500 different products, phone orders, the ability to import and export data into and out of the Volusion system, and a fantastic CRM (Customer Relationship Management) tool.
Next up, at $125 a month, is the Gold plan, which, as well as increasing the number of products and the data transfer, also offers improved customer service from Volusion. The extra features include the Deal of the Day tool, API access, MyRewards, eBay integration, batch order processing and your very own account manager.
For the big boys there’s also the Platinum plan at $195 a month which offers you unlimited products!
Volusion Setup Costs
One of the fantastic things about Volusion is that they do not charge any setup fees! Their admin interface is extremely easy to use and they include a tutorial which completely guides you from start to finish, from choosing your store’s template and design to adding your first product! Their software is user-friendly and fairly intuitive.
Platform Customization
With the Volusion platform, you have the ability to add extra features to your store via the Volusion Exchange, allowing you to add new features from there quickly and easily.
Volusion Template Options
Volusion has over 120 different templates to choose from, in different industries and colours, so you're sure to find one you like; the majority are free! Some premium templates on other platforms, like BigCommerce, can cost between $500 and $5,000, which is more than a year's worth of subscription! Volusion's main focus is on product presentation: they offer a fantastic user experience in order to generate higher conversion rates than the competition, by providing features such as product comparison and vZoom image zooming!
Transaction Fees
Volusion charges NO TRANSACTION FEES!!!!! Ok, the bold and all caps might be over the top, but I want to stress this: on every Volusion plan, from the cheapest to the most expensive, there are no transaction fees at all. This is going to save you a TON of money compared to their competitors. As well as offering their own credit card processing service, they also support many other credit card processors, if you already have your own from a previous online store or brick-and-mortar premises!
Discount Coupons For Everyone
With all plans, Volusion allows you to generate and configure coupons and discount codes for your customers with tons of different options including multiple purchase discounts, free shipping, 10% off coupon codes etc.
Mobile Store Front
One of the other perks of Volusion is that all of their plans include a mobile store front, allowing a much greater reach and a larger potential customer base! Optimization is not available for every single mobile device yet; currently iPad users will just see the normal storefront, but that's ok as they're basically computers 😉 As well as a mobile version, with Volusion you can also put your products on Facebook with their Social Store service.
Volusion Review Summary
Volusion's service is professional and affordable, making it a fantastic option for business owners of any size. Their range of free templates is impressive and there are no hidden costs. Volusion is an overall great choice for any e-commerce shopping cart, and is already used by many large companies, e.g. 3M, National Geographic and the Chicago Tribune.