After 5 years it’s time to replace my Drobo with a Drobo (FS -> 5N2)

I have been using a Drobo FS file store for over 5 years.  I purchased it back in 2011/2012 and started out with 2 x 2TB hard disks (which at the time seemed a huge amount).  Over the years I’ve added and replaced drives, both because of failures and as upgrades.  I’ve gone from 2TB to 4TB, to 10TB, to its current 13.54TB.

[Screenshot: current Drobo capacity]

In all this time I’ve lost no data and managed to survive 2 drive failures. Obviously the reasons for buying a Drobo vs the likes of Synology or QNAP are well documented.  Whilst I would have had to buy a QNAP system and 4 matched 2TB disks all in one go, with the Drobo I’ve been able to buy disks at the time I needed them.  I checked the receipt recently, and when I first bought the system my 2TB drives were just over £250 each.  Fast forward 5 years, and a decent 4TB drive is now just over £100.  This flexibility is just too convenient and breeds a certain complacency.
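
For the curious, here’s the price-per-terabyte arithmetic behind that drop as a quick sketch (using the approximate receipt prices quoted above):

```python
# Rough cost-per-terabyte comparison using the approximate prices mentioned above.
drives = {
    "2011: 2TB drive": (250.0, 2),   # (price in GBP, capacity in TB)
    "2017: 4TB drive": (100.0, 4),
}

for label, (price_gbp, capacity_tb) in drives.items():
    print(f"{label}: about £{price_gbp / capacity_tb:.0f} per TB")
# 2011: 2TB drive: about £125 per TB
# 2017: 4TB drive: about £25 per TB
```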

So why am I changing?

Firstly, I’m moving to another Drobo, and secondly I’m not replacing but augmenting.  My plan is to add one of the new 5N2s with 3 x Seagate 10TB IronWolf Pros and a 240GB mSATA accelerator SSD.  In due course I’ll likely migrate most of the FS data onto the new 5N2, but I will have a period where I run them side by side.  I’ve still not fully decided if I’ll sell on the FS and move over wholesale.  Certainly, moving 15TB (the contents of my FS and some USB-attached drives on my server) will take a good few days, so I’ll maybe put that decision off until later in the year.

What’s new in the 5N2

I’d looked at both the B810n (the 8-bay SMB system) and the 5N2 which, when I began my decision making, was very new.  I decided to go for the 5N2 after considering the facts: firstly, it would give me 40TB of usable storage for under £2200 at today’s prices (and hopefully less as drives drop in cost); secondly, SSD acceleration and tiered storage are now available in the 5-bay format, which should make for much greater disk performance.
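
As a back-of-the-envelope check on that usable-capacity figure: with single-disk redundancy, BeyondRAID gives you roughly the total raw capacity minus the largest drive.  A minimal sketch, with the drive and chassis prices as rough assumptions rather than quotes:

```python
# BeyondRAID single-disk redundancy rule of thumb: usable ≈ raw total - largest drive.
drive_tb = 10
drive_count = 5                  # fully populated 5N2
drive_price_gbp = 340            # assumption: rough street price of a 10TB IronWolf Pro
chassis_price_gbp = 480          # assumption: rough price of a 5N2 plus mSATA SSD

raw_tb = drive_tb * drive_count
usable_tb = raw_tb - drive_tb    # roughly one drive's worth is lost to redundancy
total_cost_gbp = chassis_price_gbp + drive_count * drive_price_gbp

print(f"Raw: {raw_tb} TB, usable (approx): {usable_tb} TB")   # 50 TB raw, ~40 TB usable
print(f"Fully populated cost (approx): £{total_cost_gbp}")    # ~£2180 at these assumed prices
```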

[Image: Drobo 5N2 mSATA accelerator]

Coupled with the new dual gigabit network ports and the latest-generation Seagate IronWolf drives with their 210MB/s read speeds, I hope the whole system will be far more flexible and give me the capacity I need for the next 4 or 5 years.

[Image: Drobo 5N2 dual gigabit ports]

Glasgow – Bars, Cafes and Restaurants – WiFi for the Workers – I – League Table

I’m happy to see that wireless internet is becoming prevalent in a lot of businesses in Glasgow.  I’m not talking about crappy 2 or 3 Mbit/s services, but useful speeds of 20+ Mbit/s.  I’m starting a league table for these.
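
A minimal sketch of one way to capture comparable down/up figures, assuming the third-party speedtest-cli package (pip install speedtest-cli); any decent speed test gives you the same two numbers:

```python
# One way to grab comparable down/up figures (pip install speedtest-cli).
import speedtest

st = speedtest.Speedtest()
st.get_best_server()                     # pick the nearest test server
down_mbit = st.download() / 1_000_000    # download() and upload() return bits per second
up_mbit = st.upload() / 1_000_000

print(f"Down: {down_mbit:.1f} Mbit/s, Up: {up_mbit:.1f} Mbit/s")
```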

Redmond’s of Dennistoun (currently #1)

304 Duke St, Glasgow G31 1RZ

Down: 93.2 Mbit/s

Up: 13.7 Mbit/s

[Image: Redmonds of Dennistoun – broadband speed test]

[Image: Redmonds of Dennistoun]


St Lukes (currently #2)

Calton, 17 Bain St, Glasgow G40 2JZ

Down: 27.9 Mbit/s

Up: 13.6 Mbit/s

[Image: St Lukes, Gallowgate]


How to utilise a gig – Project 2: Rip and replace the router


The second thing you do, after you get used to the sheer speed of a gig symmetric service, is start to wonder why you can only hit 850-900 Mbit/s regularly on most speed tests.  Shortly after that you wonder why you struggle to push a torrent beyond 30 meg/sec.  This was the strange set of events that led me to do a bit of a deep dive and find an unexpected gem.

So yeah, I went from 15 to 820 Mbit/s on speed tests, and yet the geek in me wondered: where had those other 180 meg gone?  Now, I grant you that at these speeds it’s a purely theoretical, intellectual exercise, but nonetheless, where had that missing 20 per cent gone?  Research quickly finds that around 900 Mbit/s is realistic with overhead on any gig service, so we’re hunting 80-100 meg.  A bit of fishing around reveals that the slow torrent speeds are likely the result of a router that can’t handle the number of connections, plus the fact that most SOHO routers (ZTE in the case of Hyperoptic) just can’t give you full line speed, port to port, all the time.  One of the nice things about the Hyperoptic service is that it’s basically an RJ-45 jack with DHCP: plug in any router and it will come up.  So which router?  After several reads of the forums, I went for a Ubiquiti Networks EdgeRouter Lite (ERLite-3).  I’ve loved Vyatta since long before it was purchased by Brocade, and the hardware reviews professed its ultra-high speed with low overhead; indeed you can offload most of the packet processing to its hardware forwarding engine.
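
To put rough numbers on that: every 1500-byte frame on the wire carries Ethernet framing plus IPv4 and TCP headers, so even a perfect gigabit path tops out at roughly 940 Mbit/s of TCP goodput.  A quick sketch of the arithmetic (standard header sizes assumed):

```python
# Approximate TCP goodput ceiling on a 1 Gbit/s Ethernet link.
LINK_MBIT = 1000
ETH_OVERHEAD = 8 + 14 + 4 + 12        # preamble + Ethernet header + FCS + inter-frame gap (bytes)
IP_TCP_HEADERS = 20 + 20 + 12         # IPv4 + TCP + typical TCP timestamp option (bytes)

def goodput_mbit(mtu: int) -> float:
    wire_bytes = mtu + ETH_OVERHEAD        # bytes actually occupying the wire per frame
    payload_bytes = mtu - IP_TCP_HEADERS   # bytes of useful data per frame
    return LINK_MBIT * payload_bytes / wire_bytes

print(f"MTU 1500: ~{goodput_mbit(1500):.0f} Mbit/s")          # ~941 Mbit/s
print(f"MTU 9000 (jumbo): ~{goodput_mbit(9000):.0f} Mbit/s")  # ~990 Mbit/s
```

So anything in the low 900s is already close to the protocol ceiling; the rest of the hunt is about the router, not the line.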

I picked mine up at Amazon (link to amazon.co.uk for the EdgeRouter Lite) for under £70.  Out of the box it’s a pretty sturdy wee beast.  Once I’d ordered I started googling, and found two fantastic videos on setup; the best one is probably this one.

The device is pretty simple to set up if you understand networking in any way: plug a cable into the eth0 port and you’ll get a dynamic IP (if not, just set your IP in the 192.168.1.0/24 range), then browse to the router’s IP: 192.168.1.1.   Once in, run the setup assistant (shown in the video); there are various ways to configure the 3 network ports on the device.  I went for WAN+2LAN, which makes eth0 LAN1, eth1 the WAN and eth2 LAN2.  I have a subnet for my lab so will eventually use this segregation to route VPN traffic in.
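
If you want to sanity-check things before opening the web UI, here’s a trivial, purely optional sketch using only the Python standard library; the 192.168.1.1 address is just the factory default mentioned above:

```python
# Quick sanity check before browsing to the EdgeRouter's default web UI.
import ipaddress
import socket

ROUTER_IP = "192.168.1.1"                        # factory default
LAN_NET = ipaddress.ip_network("192.168.1.0/24")

# Find the local address used to reach the router (a UDP connect() sends no packets).
probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
probe.connect((ROUTER_IP, 80))
my_ip = ipaddress.ip_address(probe.getsockname()[0])
probe.close()
print(f"My address {my_ip} is {'in' if my_ip in LAN_NET else 'NOT in'} {LAN_NET}")

# Check that the web UI port answers before pointing a browser at it.
try:
    socket.create_connection((ROUTER_IP, 80), timeout=2).close()
    print("Router web UI reachable: browse to http://192.168.1.1")
except OSError:
    print("Router not reachable: check cabling or set a static IP in 192.168.1.0/24")
```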

Performance once configured is pretty breathtaking.  I only tweaked the MTU on the LAN port (eth0), as my home switch supports jumbo frames (which trims the per-packet overhead a little further, as in the arithmetic sketch above).

Here’s a wee graphic of it doing a quick 861 Mbit/s, though I can regularly burst 920-970 depending on the time of day and the location of the end server.  In short: buy it!

[Image: speed test result]

Things to do with high-speed net – connected home

So, still loving the new service, but as I said in my last post, I’m now interested in exploring and sharing some of the new facilities that high-speed connectivity can offer.  I’ll start with a couple of housekeeping points:

  1. From my selfish perspective, I’m framing and qualifying all these “what you can do” articles around the Hyperoptic product that I am lucky enough to have.
  2. Many of the outcomes will be repeatable on other products (Infinity / Virgin cable etc), however results may vary.
  3. The biggest differences are likely to come from the asymmetric nature of many of the competing services – you’ll need to check how the characteristics of your particular service affect the outcome.
  4. You’re going to want 40+ Mbit/s on the downstream and a minimum of 10 Mbit/s on the upstream to even bother with most of these services.
  5. 1 Mbit/s upstreams need not apply!

So, first things first: what about using all this super-cool goodness to build a kickass media server?  I’ll be going over the basics of that in the next article!

Journey to Gig/Gig broadband with Hyperoptic

I live in the centre of a major Scottish city which, according to Wikipedia, has an urban population of 1.75 million people.  That’s no small number of people.  When I bought my place I just assumed that, being in a large new development, Infinity would be available pretty much after launch.  Well, thanks to the mysteries of BT planning, I’ve sat for the last 5 years and watched as promised date after promised date has slipped.

Late in 2014 it became abundantly clear that BT had no interest in enabling us; all the vague excuses piled up and I realised it was going to have to be a DIY approach for our building.  First, the facts: our building is just over 7 years old, a steel-frame building with built-in ducting, services and false ceilings.  All these things make it super easy to install services.  Secondly, the location and pricing of the building mean that there are a lot of young professionals and home workers (i.e. a captive market).  To me it made no logical sense that the building hadn’t been enabled (I’ve since received information that makes me believe it’s not a technical reason but a planning/political reason holding us back).

So I began my search, which in all honesty I was expecting to be fruitless.  I mean, if BT, the largest telecoms group in the country, can’t get us cabled, who can?  My first port of call was Virgin, where I reached out and got positive noises from their Cable My Street team.  I registered my interest, had a few neighbours do the same, and had good conversations with one of their outreach managers on Twitter.  After 6 months, though, we had no commitment, no in-person engagement, and weren’t going anywhere.

Frustrated, I looked at the market again and came across Hyperoptic – a company for which, on paper, our building was purpose-made.  I have to confess that due to travel I didn’t make the first approach; a neighbour picked up the baton and made the initial contact.  I got re-involved in the summer of 2015, after which I assumed the role of Hyperoptic Champion for the building and pushed the project.  I started by joining our residents’ association, primarily for a single task (i.e. getting the fibre fitted), and began to work with the excellent John McCabe at Hyperoptic.  Our residents’ association really lacked social media skills; a domain name and a Facebook forum later, I had the ability to reach out to folk and begin campaigning.

I started by speaking to neighbours I knew in person; however, it’s a big development, so I probably only knew 20 per cent of the folks there.  Hyperoptic require residents to register interest on their page, and have a very transparent tracker.  To assist, I was sent marketing flyers and materials, which allowed me to do a mail drop.  I took the basic materials and made them a bit more personal with some branding and an accompanying letter explaining a few things.  I mail-dropped them in early September, and by the end of that same month we had the required number of registrations to move forward.

Hyperoptic were true to their word and surveyed the building, reporting back that they’d be able to fit the service with no issues.  The only fly in the ointment was the lack of service hatches in the ceilings outside the units; Hyperoptic offered to install these hatches and pick up the cost (the truth being that we should have had these anyway).  As is typical, we had the doubters who thought that these small hatches would “spoil the look”, but to be frank I never accepted that, given that we have smoke detectors, lights and so forth there already.  Anyway, a vote sent out by the factor saw no significant objection, and we were green-lit to get the wayleave signed and Hyperoptic in and fitting.

The internal cabling is high-quality Cat6e.  I’m not 100 per cent sure of the switch infrastructure, but effectively the fitted network should be able to support 10 Gbit/s and beyond (technology permitting).  Cabinets were installed in the basement levels where the switches are housed, and Cat6e cabling was run first below in the car parks, and then up through the 3 blocks of the building (10 storeys).  The install is first class, to the point where, in a straw poll of folks visiting my house, I asked “do you notice anything?” – to which the answer was “what?”

The fibre install has been a bit of a bear, with the contractors (BT) missing dates and causing delays; however, we have the fibre into the basement, and jointing is to go ahead.   I’m already confident this service will have massive benefits for our residents.  To go from a 14 Mbit/s internet connection, which is beginning to struggle to support multiple over-the-top media services, to a symmetric 1 Gbit/s service is going to be a huge change, and I’m going to blog about how it affects the day-to-day of what and how we utilise media.   Stay tuned for more articles.

VMUG Advantage – a fantastic new resource for VMware study

[Image: VMUG logo]


I’ve been in a bit of a refresh cycle since my recent promotion at my company.  Over the years I’ve attained a ton of professional certifications, and it was about time to make the difficult decisions on what to maintain, what to improve, what to add and what to cut.  Having some time to myself to allocate to technical development, I’ve been keen to bring my VMware skills up to the new VCP6.0 level (I have aspirations of taking this even higher) and to bring in desktop virtualisation, virtualised networking and orchestration.   Part of my challenge is that I’ve already done the classroom training and have a ton of hands-on for VCP510, but need to sit the technical exam.  So the questions, in order, are: 1. how to attain VCP550; 2. how to upgrade 550 -> 610; and then what I can build on in terms of NSX.  Secondary concerns are taking these core fields to higher levels.

I have an ESXi box in the house; an investment some time ago means I have a Dell T620 with a healthy 12 x 2.5GHz cores, 5TB of storage and 64GB of RAM.  Nesting labs is relatively easy, but as I want to build a lab that will last, I don’t want to be affected by the constant 60-day nag messages that come with using the evaluation version.  I’ve been googling and came across VMUG Advantage, which on paper seems to solve these issues by:

  • Offering 365-day evaluation licensing for VMware tools including vSphere, Horizon and Orchestration
  • Offering discounts on specialised training
  • Forum access to a global pool of VMware professionals
  • VMware’s blessing

All of the above for US$200, and to be clear, that includes full legal licensing for the full VMware portfolio.  If you think it seems a bit too good to be true, you’re not alone, but I can assure you, having signed up, that this is completely legit.  I wish more vendors took this approach to training for software.  The sign-up process is painless, though be warned that the time between signing up for the program and receiving the logins and links is about 48 hours, as it’s a manually verified process.  Rest assured the support teams that administer the service are fast and really friendly – and yes, to repeat, you get FULL LEGAL LICENSES!  In hindsight I wish I’d not signed up on a Saturday, as I waited the whole weekend to be activated, but that’s a minor grizzle at something I did on my side.

I’ve now got a VCP6.0 box (vSphere 6.0 and ESXi 6.0) and a nested lab running an evaluation version of vSphere 5.5 for my first training course.  Nicely, it means that I can practise the upgrade from 5.5 -> 6.0 when the time comes, and at the same time evolve my box to my needs.  I plan to also go through the VCP-NSX 610 course – I qualify for this due to my CCNP and having attained my VCP – again, though, that’s secondary.  I will be posting more specifics on my lab setup and experiences, but thought the sheer brilliance of this offer just had to be shared: take it up, you will not be disappointed!  I’ve also loaded the box up with a copy of Cisco’s VIRL tool with a 30-node license – an OpenStack-underpinned service giving you full access to Cisco images, allowing you to outdo GNS3.
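
As a small taste of what a properly licensed lab lets you script against, here’s a minimal sketch using the pyVmomi SDK (pip install pyvmomi) to list the VMs in the lab; the hostname and credentials are placeholders, and certificate checking is disabled only because a home lab typically runs self-signed certs:

```python
# Minimal pyVmomi sketch: connect to a lab vCenter/ESXi host and list its VMs.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE          # home lab with self-signed certs only

si = SmartConnect(host="vcenter.lab.local",            # placeholder hostname
                  user="administrator@vsphere.local",  # placeholder credentials
                  pwd="changeme",
                  sslContext=ctx)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        print(f"{vm.name}: {vm.runtime.powerState}")
    view.Destroy()
finally:
    Disconnect(si)
```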

I once heard it said that the best money Cisco ever spent on development was creating their training curriculum and ensuring that IOS images leaked onto the web.   It’s all very well being able to buy a few cheap Catalysts and a couple of routers and sit your CCNA or even your CCNP, but what happens when you need to self-study DC gear, or advanced firewalls with 6-figure price tags?  Often this lack of hands-on results in paper-passed students, who have GNS3’d the sh_t out of it but lack real-world skills.  The only other option is piracy, which I don’t condone but can understand.

By allowing students legal, affordable and full access to your technologies, you are putting in place the foundations for a successful support structure, both in companies and in the wider market as a whole.  VMware (for supporting VMUG Advantage) and Cisco (with VIRL) should both be celebrated for taking these important steps towards enabling driven students and individuals to attain knowledge of their products.

So, over to you, Citrix, Juniper, ALE and others.

Getting more serious about a social footprint

I’ve decided to really get serious about a social profile again.  With continued interest (read: stupidity) in maybe doing more than just having fun with a camera, I’ve taken the decision to invest in my extra-curricular activities.  To that end I’ve picked up a copy of MarsEdit – let’s see if it’s £27 well spent 🙂


Chrome – The Macbook Pro Battery Killer?

I’ve had my Retina MacBook Pro for just over 3 months now, and during this time it’s both impressed and pleased me with its performance, good looks and battery life.  The other day I was reading the Apple support webpages for an unrelated problem when I stumbled over a thread relating to poor battery life on the Retina Macs.  Interested, I wondered what issues other people were having, and discovered that it seemed to be a widespread problem.  Many users reported that they were seeing only 3 or 4 hours of battery life after a full charge, when the quoted figure is easily 7 or 8 hours.  It hit me like a train: I’ve never seen battery life anywhere near that high.  Coming from an old MacBook Pro (circa 2008) I was impressed because, comparatively, my new Retina was miles ahead, but those 3- and 4-hour estimates set alarm bells ringing – did I have a shitty-battery Mac?

Luckily the support site had a couple of tips: zapping the PRAM was one of them, and the other was to kill off a corrupt desktop process.  I quickly bookmarked the pages for later reading and set an alarm to follow up.  Back at home I ensured my Retina was fully charged – with the battery showing 100 per cent, I unplugged to go onto battery power and watched the power meter.  After a few minutes saying “calculating remaining time”, I got my figure – 3.5 hours!  To say I was dismayed was an understatement.  You don’t expect your £2-grand pride and joy to have lost battery condition so quickly.  I followed up the bookmarks, zapped my PRAM and wiped out the faulty process: improvement, we were now seeing 4.5 hours, but I then watched this fall rapidly in 10-minute increments.

By now this was really worrying me.  Having bought the Retina in Canada, worries regarding warranty started to surface.  Where was all this battery going?  A glass of wine and some thinking time followed.  I checked Activity Monitor, but no processes were consuming huge chunks of CPU time…  Back to basics, Layer 1: Physical – with virtually no applications running, the bottom of the machine was hot.  I checked a quick schematic of the machine, and the hottest spot was definitely where the CPU resides.  So how can the CPU be running hard with no process consuming any CPU time?  I closed every application (even though I only had uTorrent, iTunes, Mail and Chrome open).  I could feel the machine physically cool, and the battery life start creeping up: 4 hours, 4 hrs 20, 5 hours, 5 hrs 30.  So an application was crippling battery life.  In the past I’ve known iTunes to be a resource hog, so I opened it on its own and started playing music: battery life stuck at 6 hrs 25 minutes (on 75 per cent battery) with no noticeable drop.  Opened Mail – again no noticeable drop.  uTorrent is lightweight, so no surprises: opened it and again no noticeable drop (even when hitting 1.5 MB a sec).  Opened Chrome: BANG! The battery started to drop like a stone!

[Image: Google Chrome icon – “the evilest browser in the world”]

So what’s going on here? Chrome is draining the battery – I proved this by having only Chrome open.  Chrome closed: battery 6 hrs 25; Chrome open for 5 minutes: battery showing 3 hrs 20 and the machine running hot.

Conclusion

I can only assume that the current build of Chrome supporting the Retina display is a bit of a resource hog.  I’d speculate that the fact that each tab runs a process taking 2.5-3% CPU quickly adds up and makes the machine run hot – and that’s with 4 cores of Ivy Bridge at 2.6GHz.  I’ve switched over to Safari and done several tests: I can drain 20 per cent of the battery using Chrome for half an hour, whereas using exactly the same sites in Safari barely dents battery life.  I’m not sure whether anyone else has had this problem, but I’ve checked it on both a 2012 Retina MacBook Pro (2.6GHz) and a 2012 MacBook Air with similar results.  If anyone has any ideas as to what is causing this, or has had similar problems, I’d be interested to hear from you.
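
If you want to check the per-process numbers on your own machine, here’s a quick sketch using the third-party psutil package (pip install psutil); it simply adds up the CPU share of everything with “chrome” in its process name over a short sampling window:

```python
# Sum the CPU share of every Chrome process over a short sampling window.
import time
import psutil

def chrome_cpu_percent(sample_seconds: float = 2.0) -> float:
    procs = [p for p in psutil.process_iter(['name'])
             if 'chrome' in (p.info['name'] or '').lower()]
    for p in procs:
        p.cpu_percent(None)            # prime the per-process counters
    time.sleep(sample_seconds)         # measure over the sampling window
    total = 0.0
    for p in procs:
        try:
            total += p.cpu_percent(None)
        except psutil.NoSuchProcess:
            pass                       # a tab/renderer may have exited meanwhile
    return total

print(f"Chrome total: {chrome_cpu_percent():.1f}% of one core")
```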

Apple’s Strongest Line Up for Years? – tell the analysts


As an ever-vigilant and dedicated Apple fiend, I find it hilarious just how OTT the analysts go.

Apple have easily the strongest product line-up across the board (the only gaps you could mention being the Mac Pro and perhaps the Apple TV), and yet despite this, and the fact that they reported their strongest quarter ever, the analysts still knock the share price.  It seems to me there are too many of these guys over-inflating Apple’s aims for the quarter – no doubt in a couple of years they will be expecting the iTimeMachine, and then moving to a sell rating when it doesn’t materialise.

Furthermore, when they do respond to analysts’ calls for products (case in point: the iPad Mini), they are criticised for doing it as a panic move.  Really, a company can’t win: perhaps they can use some of that $125 billion-plus to buy some counselling to get over the hurt 😉

What any rational human being can say without bias is that there’s never been a time when Apple’s line-up has been so strong.  The iPod Nano and Touch are as good as they have ever been; the iPhone 5 has given us the long-awaited larger screen with custom silicon and LTE; the iPad has gained the 7-inch product as well as the best-in-class 9.7-inch pad; and the basic MacBook and MacBook Pros are doing great business, with the new Retina systems further enhancing the German-esque premium brand.  The Mac Mini is now as powerful as the 2010 Mac Pro and only getting better, whilst the iMac is looking just as svelte as ever, and now even thinner.  The transition from optical media to the cloud is complete, we’ve got interfaces that will last the next decade, and the OSes (both OS X and iOS 6) are just becoming more polished with age.

Add to this a strong cash position, the ability to quickly integrate Intel’s tick-tocks and the ongoing advancements in ARM – as well as remembering that a port of OS X is probably already running on ARM hardware – and it paints a very vivid picture of success to come.

Oh yeah: and they have Sir Jony Ive 😉