Andrew Breese

Musings of a professional geek

Sync Calendars between Lotus Notes, Outlook, and Google

For a while now I’ve been trying to sync my various work calendars, which run on three separate, disconnected platforms: Lotus Notes (my current site), Outlook 365 (my office), and Google (personal and phone). After trying an open source solution, InGoogleCalSync, which did half of what I wanted, I found a paid service called AweSync which is darn good – and worth talking about.

Essentially, AweSync allows calendar entries to be sync’ed both ways between Google’s calendar and Lotus Notes. It also syncs tasks and contacts, but I’ve not turned on that feature. The app is clever enough to understand that I have multiple Google calendars and to manage changes between them. The open source app was limited to a one-way push, and events could not be reliably edited in both places, but AweSync handles this.

The Outlook calendar sync comes from the boilerplate MS Exchange config on the iPhone, which supports contacts, tasks, events, etc. This means that events from my company are two-way sync’ed between Outlook and Google in one calendar, events from my work site are sync’ed from Lotus Notes to Google in another, and my third Google calendar is for personal information. I can now see all three of these sub-calendars in Lotus Notes, can tell by colour which is which, and also see an exact match on my mobile phone. Just like it should be when we try to have a central place to manage meetings and appointments – this small app and the darn large behemoth called Google have provided what I needed.

AweSync was a mongrel to set up due to the locked down permissions on my site computer, which essentially needed to be opened up so that it could run properly. The support team from AweSync were wonderful, and it was their diligence in assisting me that really helped me decide to purchase it (USD$20).

So I still absolutely loathe Lotus Notes as an email and calendar application, but at least I can now manage my appointments properly.

xkcd’s ContextBot comic

Managing backup, cheaply

For years I’ve had no real backup strategy at home. Files saved into various cloud providers like DropBox, Google, etc. might appear to be backup-ish, but they really are not a backup tool at all – they are file sharing and synchronisation for the sake of easy access. In other cases the family was using a combination of removable hard disks and USB memory keys. In our house different files are in different “cloud” places, which makes finding them a mess and managing them difficult.

I decided that my holiday mini-project was to get a backup solution for the house.

Goals and considerations:

  • Applications should have a simple interface, and be usable by a non-geek to back up their personal files.
  • The ideal solution should have multiple backup locations and hopefully different styles of backup. This means copying the same data into several different places, and considering cloud, NAS, or portable drives in combination.
  • Ideally be automatic, or at the very least able to be scheduled or controlled.
  • Be something my family can use, and something that I can maintain without too many headaches.
  • Due to the size of my data, a cloud solution is problematic. However some of my family might be able to use a cloud service, as they have a much smaller amount of data to back up. e.g. My core set is around 180 gig, with a fair amount of extra stuff I’d like to add to it, compared to my partner’s data which is around 12 gig.
  • The backup system should perform incremental backups, especially if it is to run over slower connections like broadband or wifi. A very large part of my home’s data does not change often at all, and I want the software to update the backups it has already made, not copy the entire data set again.
  • USB memory drives are now hated and won’t be used as any type of backup. They are too easy to lose and they fail too often.
  • Support for Windows is mandatory; anything else (Macs and Unix or blah blah phones) is a nice-to-have. I’m not a SysAdmin anymore, so I no longer have a Linux server chugging under my desk.

So in a bit more detail…

Simplicity is needed as I’ll most likely be configuring this, but in the event that I need to recover the data from a backup I want something that I can walk somebody else through. I’m also likely to be called upon to “fix” a backup related issue on short notice and with a time deadline, so something that performs most of the work auto-magically (I hate that phrase but it suits) is important.

Multiple backup styles are important because any single style of backup might be corrupted or fail, so having several redundant techniques is a stronger, broader approach.

Initially I’ve chosen the CrashPlan application, as it has an interface that is very simple to configure and run, suitable for non-IT-literate users, and it also has options that are reasonable for IT geeks. I’ll be testing this in a local setting (phase one below).

I’ve read there are solutions with easier interfaces that are potentially slightly cheaper, but from my short scan of the feature sets of the top 10 vendors, only CrashPlan has remote PC-to-PC backup which is independent of a cloud solution. I really like this idea when I consider how a family might interlink their computer resources to back up each other’s data.

CrashPlan is also interesting because the basic backup task of saving a set of profile-related data to a removable drive or another computer is really easy and free. Yup, free for the basic backups. There is a wide range of backup application vendors and especially cloud backup providers now (Backblaze, Carbonite, CrashPlan, etc), so do a little digging to find the vendor that suits your needs.

Lastly, before I get to the detail, a note about “the Cloud” and “cloud backup”. To be frank, in Australia a typical USA cloud solution isn’t viable or fast. I love the cloud as a concept and a tool, and I liked the idea decades ago, before server-centric computing was morphed by marketing jargon into “the cloud”.

Down under we are still not able to access cheap, fast cloud services, and that kills what I’d really like to do – which is just pay a monthly fee and back up every damn file, quickly. I tried it with a few providers and the speed just isn’t good enough yet. Try moving 160 gig up into a cloud provider and you’ll have a task measured in weeks! Time will change that eventually.

The rest of the article is about what I’ve got working and what is next.

Phase one is getting the solution working at our house, across the important computers. Not hard, a bit time-consuming, and it needs to be watched over a few weeks to know for sure.

Phase One – Get it working locally.

This means getting the backup working locally in the home where the PCs are used. This was a simple task of installing the application on each PC and then pointing them at a backup location. e.g.

  • Laptop 1 is an older unit with very little free hard disk space, and also has a very small backup amount (12 gig). So that unit pushes data to Laptop 2 which has far more free space.
  • Laptop 2 needs to save about 180 gig of data somewhere, and has a local disk with 250+ gig free. So it pushes its data to a removable drive.
  • Ideally both computers would have a fair amount of free space and they could back up to each other, but such is life. When that older laptop is retired I’ll use the new one in the same way.

This has been in place for a week now and seems to be working. I’m watching with interest how the different systems connect to each other, how much bandwidth they demand when running, and also how much CPU the solution churns both when running and in the background.

My hope is that the backup solution will “just queue a retry” when the appropriate destination isn’t present. It seems to, but time will tell. A month is a reasonable shake-down time.

It is interesting and useful that CrashPlan sends me a periodic email summary of where it is up to. I like that.

Phase Two – Get it working with a NAS, where the NAS is the destination for the backup.

Specifically, I want it to back up onto the NAS; I’m not at all trying to back up a NAS. Backing up a NAS via a Windows program running on a separate PC, across a wifi network… is a nightmare scenario with too many moving parts. Most of the articles linked below address how to mount and back up a NAS drive to elsewhere using the CrashPlan client.

Instead I’m looking to use the NAS space as the place to drop the backups. CrashPlan does not do that easily either, but the work-around for cheating with a NAS pointed me at a working, unsupported solution.

Aside on NAS backups – CrashPlan does not support backup to a NAS on Windows (Macs and such work fine; it is a limitation of the way the Windows OS handles services running as users and the security permissions). So yes, it can go between PCs, but those PCs have to have CrashPlan installed, and my NAS is a more generic consumer media drive with no Windows OS to use. This is a limitation that hurts the product for Windows systems.

There are unsupported, known work-arounds though (have a read here), which is where this step comes in. I did not use the advice in the linked article, as it uses the “Net Use” command in a Windows batch file, which I don’t see as a useful approach for Windows 7 and Windows 8. After too many years using batch files to bend Windows to my ever-changing and unforgiving will, I now avoid them.

It certainly might be ok in WinXP, but thankfully I only need to worry about Win7/8.
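For the curious, the batch-file approach the linked article describes looks roughly like this – a sketch only, where the drive letter, server, share, and credentials are all placeholder values:

rem map the NAS share to Z: with stored credentials (placeholders, not my real names)
net use Z: \\nas-server\backup-share MySecretPassword /user:nas-user /persistent:yes

The mapped drive can then be offered to the backup software as if it were a local destination. The obvious downside is that the password sits in plain text inside the batch file, which is exactly the part I wanted to avoid.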

Instead I’ve used an NTFS symbolic link (which is akin to a shortcut, but not quite), where the operating system sees the connection as a present and working file or volume. Essentially the user will see a folder which looks almost normal, and when they open it the current user session credentials are passed through to the NAS and the NAS folder will show up.

This is useful as it disguises the network share as a directory local to the computer’s operating system, which can then be used by CrashPlan for backup. It also establishes the link as something which is persisted to all parts of the OS, which means that the authentication for the connection is no different than any other share. This is handy because the batch files would have required authentication details saved within them, which I conceptually detest.

Still to do – I need to double-check that the Windows indexing service and all the associated scanning services (anti-virus, etc) are set to ignore that symbolic link. I don’t want the OS to manage that area at all, just point to it.

The approach is predicated on the NAS server being online and present when CrashPlan needs to run, but so are all the approaches (duh!) – you’ll always need the destination to be online.

The mklink command syntax needs to:

  • run the command as Admin, which on Win8 might be a UI challenge in itself. Find the command prompt in the GUI, then right-click to run it with elevated permissions.
  • use the “/d” switch to indicate that the link is a directory.

mklink /D C:\temp11111 \\server\share\foldername\
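To sanity-check the link before pointing CrashPlan at it, and to remove it cleanly later if needed, something like the following should work (same placeholder paths as above, from an elevated command prompt):

rem list the NAS folder's contents through the link, to prove it resolves
dir C:\temp11111

rem delete only the symbolic link itself; the files on the NAS are untouched
rmdir C:\temp11111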

After adding the link and then configuring CrashPlan on my laptop to point to the NAS – it backed up.

That was an exciting nerdy moment for me as it meant that my files were now backed up in two places. A huge tick in the redundant locations requirement. This is in place now too, although I do need to make sure the local network isn’t being crushed by all the traffic. The backup app kicks off automatically, so it could gobble up the local network unexpectedly.

Still to do – The last trick I’ve yet to do is to investigate and configure the advanced setting that stops it running during the “busy times”. I think I’ll set a schedule from midnight to 6am where it can run unhindered.

Phase Three – Get it working cross-site.

The last step is to get an additional redundant location which is also offsite. This phase is still very much in the playground stage.

The product’s cloud solution could do this, except that it initially said the data upload was going to take 4-6 weeks of continuous running! Nope, sorry. I want a backup somewhere which isn’t in the house, so that theft and fire can’t affect it, and until the online speeds improve I’ve reverted to using portable hard disks.

Aside for investigation later – There might be a way to seed a backup set onto a portable drive, move it to an offsite location, then copy that backup to the NAS or some such. I’ll dig into that later. It might be useful for both the CrashPlan cloud backup and backup across the VPN between houses.

This is the tricky and complex step, as it involves setting up a VPN into my home network and then configuring the laptops to recognise the NAS as a valid CrashPlan backup location. It has the flaw of affecting both my home bandwidth and the bandwidth of the other end, and also will chew up a lot of my family’s internet allowance each month. The usage is certainly something that I’m not sure about.

There are two options: either I use my account and back up between my computers, or I set up an account for each user and cross-link the machines. Hard choice, so I’m doing both.

e.g.

  • My laptop will use my NAS, and also a removable 2TB USB3/SATA drive. I might throw a 3rd local network location in too while I figure out an external location.
  • My partner’s laptop will use my laptop, and also the NAS.
  • My mother’s two laptops will use each other (locally) at her house, and also share files into my NAS via a VPN.
  • My brother’s family of laptops (four of them) will back up onto each other in whatever mesh makes sense according to their disk space and backup size, also onto his removable hard drive, and then onto my NAS via VPN.
  • I’ll consider a family Cloud backup plan at this point too, to see if the backup is worth it, but frankly the speed from Australia isn’t great.
  • I’d love it if each of the three sites had a NAS which then swapped backups, but that is a dream for a more enlightened time. Maybe next year.

Another interesting factor in the choices is which VPN type to configure. PPTP is quick and simple, but not terribly secure anymore. L2TP is better (far better, I’m told) but requires setting up keys. OpenVPN is great and I use it a lot personally, but I’ve never set it up before from the “end” side.

I got the PPTP-style VPN working at home with my router as the end point in a few hours, opened a firewall port, and then closed it all up again after reading more about just how hackable PPTP is.

And this is where I am now, muddling around with reading about OpenVPN vs L2TP config, and how that might be performed on my router, or more likely through my router straight to a host (yikes!).

It’s nice to have a technical challenge hobby project again.

Last comments

I’m feeling far happier with just a basic backup being performed automatically and regularly at home. Even without an off-site solution this is far better than nothing. The VPN questions will take a long time to solve, and will likely be a tech support challenge for the other users.

I wish I’d purchased far higher-end gear for my media server and the hardware in the laptops, as it would have made this a little simpler, but my choice at the time was to limit cost and not be doing SysAdmin tasks at home. It means the next set of hardware will likely be mid-range gear, avoiding the home user stuff.

…So if you are after a home backup solution, consider CrashPlan’s free offering. All you need is either some diligence or an extra PC in the house to swap backups with, and the configuration is not hard.

Is the real fallout from the Sony attacks yet to come?

A cancelled film, a company’s worth of hacked and destroyed computers, stolen personal data, stolen company records, and fear mongering make up the first phase of the attacks on Sony. Whoever coordinated this series of attacks has played a very good game so far, and continues to use FUD (fear, uncertainty, and doubt) to channel almost everyone’s thinking. In the face of real violence most people will prefer to play it safe.

I know that I’m certainly not going to do anything to endanger my family; which is where this first phase of the story ends. I’m a little fearful, and I’m watching the news and paying attention. I’m apprehensive about writing a blog post. Good game. You’ve won phase one.

Are changes needed? Perhaps the extreme withdrawal and kowtowing by Sony is a move by the mega-corp against another of Sony’s dark foes – internet piracy.

The next phase worries me far more because it won’t be about this attack; it will be about the changes that our governments and companies try to introduce to protect us. I’ll be very surprised if governments and corporations don’t wish to further change laws based upon the fallout from these events. Perhaps on first thought they might be right; some changes are probably needed – Sony was hacked wide open and they have a huge amount of things to fix and recover from. The financial and reputation cost is non-trivial. Geek-types such as myself might have a love-hate relationship with them due to various opinionated views on consoles and games, but the general public think of them as a big movie studio. And that studio just got slapped very hard.

What I think we are about to experience, in a wider context, is these events being used to further strengthen the arguments for a regulated internet. In my own view the severity of the attacks escalated when the hackers threatened to do something to the people who went to see The Interview, and then Sony gave permission to withdraw the movie from cinemas. I think this changed the way the public viewed the events, from a company being attacked to limit its profits, to a threat to the joe-average punter.

It is a conflict targeting the balance between our fears and our freedoms. And when the laws are changed to protect against the phase one events it could be at the detriment of our wider freedoms. It is a delicate balance with no perfect solution, but many bad ones.

A superficial rationale is: hackers and their nefarious tools did all this, so (insert country, company, mother’s name) needs to be protected.

I was never going to see The Interview, so selfishly I wasn’t fussed that a studio yanked its release, or that Sony got hacked in the first place, as I wasn’t affected. I am going to be on the internet over the next 40 years, though, so I am concerned about how much leverage this type of event gives governments to make sweeping changes. Australia (my home) has changed laws around police powers in reaction to both terrorist threats and cyber-threats, and certainly already has some very powerful and uncompromising anti-hacker and anti-terrorism laws.

Yes, I’m possibly wrong too.

Perhaps Sony won’t apply its huge financial loss and damaged reputation as a stick to beat the American government with. Perhaps they won’t use FUD to push an anti-hacking (bit torrent, dark-net, etc) agenda any more than they already have. And it’s unlikely that America will be able to directly change Australian internet freedoms soon… except that most laws passed in the US also impact big sections of the internet, and Australia is well known to mimic and support American interests.

(Aside – don’t misunderstand that point please – in principle, countries working with a unified policy is interesting and can be valuable.)

Read the backlog of anti-piracy material from film studios, and the fear around one government attacking corporations in another.

And then please read widely on the freedom vs security as it relates to both the internet and your rights as a civilian.

And make up your own mind. I’m uncertain, but the FUD is working on me today. The Sony hack story is far from over.

Thoughts about Apple’s recent batch of news

It is almost like Apple has a Deck of Many Things, and got all the bad cards at once.

They’ve had a seriously large wave of mixed and poor press, which can’t be helping their position in the marketplace. That said, it feels more like a series of unconnected issues which could have affected any tech giant. I have to wonder though if they’re spread too thin. MS has had months like this too and folks still use their gear every day, and the laundry list of odd things from most tech firms reads a similar way – although not with as many issues compressed into such a short time frame. We’ve seen:

Apple’s new iPhone 6, which people have gone crazy for…

  • I’d love a faster processor and more RAM in my old iPhone 4. I’ve seen the speed difference between the 4 and 5 and it is dramatic for the same apps, so I can only think that the 6 will be faster too. Add a little OS bloat and app bloat and perhaps the speed difference will fade over time, just like it does in our other computers.
  • I’m certain that iOS 8 will be a better operating system than v7, just because it is where they will spend their dev budget. I don’t look forward to the end of life for iOS 7, but it is coming very soon now that sales have been so good for the new toy.
  • There is a comparison between the Galaxy S5 and the iPhone 6 variants which all but says the Galaxy has the same feature-set. I’m not sure that is true given the hardware specs, but the high-level feature-set looks darn similar.
  • Larger phones do not appeal a lot to me, because I want a device which is a phone first, then a PDA, tablet, etc. I need to be able to carry it without it being obtrusive, and hold it in hand easily. It needs to fit easily in a pocket, and there is no way in hell the new huge iPhone will fit in the pockets of my wife’s jeans. If Apple made an iPhone smaller but faster than the v4 form factor, or even just slimmer, I’d really consider buying that for both of us.
  • I’d also pay for significantly more battery life. Not just an extra few hours, I mean give me a week between charges like the bad old days in the 90s mobile phones.
  • My iPhone v1 (which couldn’t be purchased here in Australia) still works fine. I now use it as a music player in the house for one of the bedrooms. Yes, it needs to be always on charge, and it only holds a small amount of music, but it is doing great.

Apple’s iPhone 6 can apparently bend easily in your pocket…

  • If true, sheesh. Smaller, tougher phone anyone? Can somebody rush out a titanium-laced backbone phone cover for the damn thing and sell millions of units…

Apple releases a new privacy statement, which might take a broadside at Google

  • It reads well if you don’t think they’re being smart-arses; it only reads as rude if you assume they’re being snarky. To be frank, I’m not convinced that it was rude at all.

Apple has the (perhaps hack) issues with iCloud file security.

  • Blaming the end user is pointless and exploiting them is horrible.
  • Anything and everything can be hacked given enough time.
  • I do not use iCloud because I do not trust ANY of the cloud services (yet). If you want my data, then come to my home (or my offsite back-up server) and get it from my cold dead hands.
  • And they are addressing it.

Apple gifts the U2 album to users as a purchase, creating a storm of negative press…

  • Giving away free music from U2 should have been a huge promotional boon, but the auto-purchase rather than opt-in shows that the strategy didn’t consider the negative response at all.
  • Ironically you need to have an iCloud account to get it, so I can’t grab the album. This is where I’m OK with the choice not to use iCloud – yes, I miss out on the features and offers they are making, but I also know that my data is still mine. That approach adds risk of loss from theft and hardware failure, but I manage that risk through my own back-up.
  • I would like to listen to the album, but I’m not really fussed, and certainly not going to pay for it.
  • I am sure that it will be available on illegal sites straight away.

Apple’s iOS 8.x update rolled back due to lost calls and a few other issues

  • New iPhones have an issue with the 8.x update, and a rollback has been sent out. Youch.
  • “If it ain’t broke, don’t fix it”. My advice is to never apply an update to your devices (laptops, PC, phone, etc) in the first weeks of its release. Let somebody else be the tester and suffer any angst. This has happened to so many devices that only people whose issues are directly addressed by the patch should try it early.

Apple’s OS X might be affected by the Bash exploit

  • The exploit could affect a variety of Unix-like operating systems (including OS X), but the method of attack is very specific, so it isn’t likely to affect most users, especially everyday end users. Don’t panic.
  • The potential for the exploit has been around for decades, and this is something to patch and not something to panic about. The sysadmins will know which systems are at risk and they’ll be stressed enough for everyone.
  • Get the systems patched and buy yourself a coffee for being on the ball. Don’t buy into the hype.

Apple is still the device platform I use at home, and unless somebody releases a cost-effective way to change that while keeping the same ease of controlling the devices, that won’t change soon. I’m locked in for now.

They need a weekend on a beach somewhere to chill, then return to work refreshed. Poor bastards.

Neuromancer is 30 years old today

SoylentNews gave me a great tidbit of random trivia – the novel Neuromancer by William Gibson is 30 years old today. I remember reading this book when it came out and being totally dazzled by the concepts (don’t guess my age please, it’s not polite). Gibson wrote substance which has resonated for decades, and is still pseudo-relevant even after so many other fantastic authors have launched further from his base.

Thank you Mr Gibson, the work is darn appreciated.

a bit more haiku malarkey

Here are a few more haiku, pondered while I was trundling home on the train. You sometimes go to strange places when you’re breaking the world into segments of 5-7-5. I’m not sure if there are also supposed to be titles for poems like this, so some have them and others do not.

Can I gantt this?

The office is calling
Tomorrow’s due date is past.
Deadlines are like that.

We’re always recruiting.

The office is calling
Your team is halved again.
We are here to help.

Am I a spy?

The office is calling
We know you are tired and cold.
You need to come in.

 

That isn’t English?
Tell me who understands you,
they’re a living saint.

 

Meeting tomorrow.
Work up to the 13th hour
And it’s a Friday

 

Servers down again.
Don’t they know it’s past midnight?
Let’s ring the PM.

 

Time scope cost mantra.
We meet to raise productivity.
Is that irony?

 

where did the time go?

You must record time,
Liar I don’t trust your times.
Are these bills correct?

Well folks seem to like (bad) PM haiku

I tweeted* a project management haiku recently and my Twitter traffic went through the roof – well, as through-the-roof as a change from zero tweets to one tweet can create. I think this is part of the reason why social media carries so much weight – it pays into the stimulation response we get from having something seen and quasi-appreciated.

Like a good doggie, I’ll do it again shortly and see if I get another biscuit**.

Let’s not plan on giving up the project manager day job though, as it would mean no more snarky tweets about being a PM, and I’m under no illusion as to the amount of banter and wind already thrown into the digital wind.

* I really dislike that word as a verb relating to posting content online. Birds should keep this word to themselves, and rise up in feathery rebellion against the humans’ technology. Like a Planet of the Apes spoof where all they do is poop on our tablets and eat the phone lines. Rebellion! It is what it is, and the word won’t be changed till Twitter dies.

** Yup, it’s Friday down here and I’m feeling tired and strange. Back to the geeky blog posts shortly.

a random haiku

I’m trying to …

A guessed budget
Scope is a little too large
What have we left now?


A bit of banter at work found that a few of us project managers like haiku. When done well (better than the above by half) they can be a wonderful source of inspiration and calm. This one is meant to be a bit of an odd riddle too, so try to guess what the title means in context with the haiku itself.

Yup, it’s a bit wanky.

Spoilers about the answer after the break.

 


What are the eBay hack implications for ordinary people?

So eBay was hacked pretty badly (Soylent post, and eBay’s own announcement), which for the geeky-type folk is interesting in terms of how they did it; but more importantly, every eBay user should take care and think about what they have shared with these companies.

The nature of the hack is typically complex, and the explanations of how it was done are not really that relevant to everyday users (I’m not being condescending by saying that, as I consider myself an end user too). eBay has made a fair effort to give details recently, but can also rightly be questioned about how long the advice took to reach their customers. Rightly or wrongly it is around 2 months since the actual event, and that is a fairly long time to wait before requesting password resets.

  • For users it means change your passwords now. Change them to something which is hard to guess; preferably a gibberish nonsensical combination of letters, numbers, and special characters.
  • Also change the passwords for any services which you used with eBay too.
  • And please ensure the password is not the same as any other services you are using.
  • Consider using a password vault of some sort. It makes having all these different passwords easier, and it also helps you when using multiple computers. Yes, it can be hacked too, but everything can be.

The reason for this is we don’t know what other attacks are being attempted, or what previous attacks might not have been understood well. PayPal is certainly a regular target for trouble too, so consider altering your credentials with them as well.

The tech jargon translates to mean that your password wasn’t easy to read, but it is only a matter of time until it is. Database encryption is a wonderful thing, but time and brute force will beat almost anything in use today; and by comparison the “safe” encryption methods of five or so years ago are now considered questionable.
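As a rough, back-of-the-envelope illustration of why length and variety matter (my numbers, not eBay’s): an 8-character password using only lowercase letters has about 26^8 ≈ 2 × 10^11 possible combinations, while a 12-character password mixing upper case, lower case, digits, and ten common symbols has roughly 72^12 ≈ 2 × 10^22 – nearly a hundred billion times more work for a brute-force attempt.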

Of the 140 million accounts compromised, wouldn’t you rather be one of the ones that isn’t open when the hackers decrypt all those old passwords?

Is this an alarmist approach? Well, no. This time resetting your credentials is the first step.

There are additional steps “normal” online users should do too:

  • Consider changing what personal information you are currently saving into and sharing with every online service (a.k.a. website). Each app, each URL, every vendor, all those games, and whatever Facebook widgets all collect information from you, and you’re far better off if the collection of information out there is as vague as possible. I recently went through Facebook, eBay, PayPal, and a few other services that I use and removed a lot of personal information. From now on they only get the minimum.
  • As part of that depersonalization, consider getting your orders delivered to a place which isn’t your home address. I often use my work address, as there will always be somebody there during postal delivery hours, but it also means that eBay and such have no idea of where I actually live.
  • Don’t trust your app vendors any more than you’d trust the guy at your local clothing store, cafe, or petrol station. Just as it is trivial for somebody working in the store to grab your card number, online it is harder to grab a single card but far easier to do it in huge quantities. This means that while you’re generally more secure online for a single transaction, the methods of attack are far more complex and harder to understand.

Hope this was useful, and also wish the darn hackers would get into something a little less destructive and nasty. The skills to do some of this activity are significant, and there has to be a better way to gain money or notoriety.

 
