Sunday, January 29, 2012

Getting out from beneath the cell phone giants

My wife and I are currently paying $135.45 (before taxes) for our cell phones, which gives us 700 shared minutes and two unlimited data plans. In an effort to save money, we are biting the bullet and considering alternatives to bring this perpetual cost down.

Spoiler: We are going to switch to a $2/day prepaid plan with no contract. We are going to use Google Voice for most of our phone calls for free and will buy an adapter ($50) so that we can use a regular phone to make Google Voice calls from our house. We can then adjust our usage to effectively pay $0/month for our cell phones instead of being locked into $135 month after month.

Our requirements are:

  • one data plan for GPS navigation and web browsing – we decided we could share the phone with the data plan when needed
  • great cell coverage for emergency phone calls
  • ~500 text messages / month
  • ~700 minutes / month

Coverage for emergencies is a high priority, so this limited us to Verizon and ATT. This was based on looking at coverage maps and asking around.

Now that we are down to two companies – it is time to think about the plan.

My first thought was that if we joined forces with another family, then a family plan might be worth it… I built an Excel spreadsheet and found two things:

1) Unlimited plans would be more expensive than we are paying now

2) 2000 minute plans would only be slightly cheaper than we are paying now

(note that this is based on 2 lines with 2 data plans per family)

[image: family plan cost comparison spreadsheet]

This was unacceptable when factoring in the drama that comes from working with another family (fighting over minutes, handling money, etc.) so we moved on to prepaid…

We learned that prepaid with Verizon was not an option. They charge you $10-20 for texting, plus the voice part, plus the data part. ATT's prices were close to Verizon's but included text messages, making Verizon about $10-20 more expensive.

So on to ATT prepaid. Let's first talk about data. The data options are:

[image: ATT prepaid data plan options]

This means nothing if you don't know how much you use – so off to Verizon to see how much we use…

[image: our Verizon data usage by line]

Ouch – line 5961 (me) is using a lot – this prepaid thing (as far as data goes) might not work out. Line 6447 could get by with the $25 plan, which is $5 cheaper than what we currently pay Verizon. I called ATT and asked what would happen if I went over the 500MB.

“If you have money in your account, and your account is configured to allow you to go over – then you will buy another 500MB.”

This is good news in the sense that I can use more than 500MB, but it means I would need 3 subscriptions per month, totaling $75. Not so good.

But – this is all a moot point because I am not using that much anymore. At my last job we were not allowed to stream radio – so I was using Pandora through my phone. At my new job I can stream through my computer – so what is my usage like now…

[image: my recent data usage]

So this usage, extended out to a 30-day month, is ~260MB. Much better!

So data is $25. And the best part is… because I am not under contract (by going prepaid) I can turn the data on and off at will. I confirmed this by calling ATT; they do not require data with a Droid on a prepaid plan. Worth noting that the iPhone has some special data requirements and isn't eligible – but this didn't affect me, so I didn't spend any time figuring out what that meant.

Now to the voice and messaging of pre-paid.

The basic plan is

[image: ATT prepaid basic plan pricing]

This provides a benchmark for comparing to the other plans. The pay-by-minute plan looks like this:

[image: ATT prepaid pay-by-minute plan pricing]

With unlimited text messaging but only 250 voice minutes included, the question becomes how many minutes I would have to use to be better off with the unlimited plan – in other words, what is my breakeven point? A few calculations later and we arrive at 500 minutes. This is not looking like a good fit for us.

So how about the pay-by-day plan?

[image: ATT prepaid pay-by-day plan pricing]

This plan also has unlimited texting, so the question is how many days I would have to use my phone to be better off with the unlimited plan. At $2/day against the $50 unlimited plan, it is simply 25 days.

It is worth taking a pause here to explain the pay-by-day plan. If you use your phone to make or receive a call, or send a text message, that is a $2/day charge. Yes, receiving texts is free. Once you pay for a day, you have unlimited calls and texts.

Back to the analysis – if we can keep it under 25 days, then we would be better off with the pay-per-day plan.

So for the final analysis we need to compare the prepaid to our contract plan. Our current Verizon bill is $135 – we have already decided to remove one data plan at $30, so we arrive at $105. Prepaid monthly would be $50 per phone + $25 for one data plan, totaling $125… but what about pay per day?

The breakeven point for pay per day with one data plan at $25 and 2 phones paying $2 / day is 20 days per phone.
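Spelling that arithmetic out:

$105 (contract with one data plan) − $25 (prepaid data plan) = $80 left for voice and text
$80 ÷ ($2/day × 2 phones) = 20 days per phone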

Our conclusion:

We are committed to bringing down our cell bill, and prepaid gives us the greatest flexibility to do that. We like that if we are trying to save money we can easily cancel data temporarily and limit our phone usage to, potentially, emergencies only. We would gain the option to not use our phones for a single call or text and effectively pay $0 in any given month. And during that entire month we would still have coverage and the capability to make an emergency call.

Heavy usage months would cost us $125, which is more than the $105 contract. But we plan on having enough savings throughout the year to offset these heavy months.

To help facilitate limited cell phone usage we are setting up Google Voice accounts for each of us. We will be able to make calls from a house phone using Google Voice (going through the internet), mitigating some of our cellular needs. Note that Google Voice is currently free. We will not give out our cell numbers and will instead give out our Google Voice numbers. We will configure Google Voice to forward to the house phone, our computers, and our cell phones. If we are unable to answer on the computer or the house phone, then we can choose to answer with the cell phone.

Angela won't have a data plan but will be able to connect her Droid over our wireless network. She will still be able to use all of her apps (like Facebook), just not while out on the road.

We will limit calls while driving and instead make those calls from home.

We ordered an OBi phone adapter so that we can connect a regular phone to our Google Voice account.

http://www.amazon.com/OBi110-Service-Bridge-Telephone-Adapter/dp/B0045RMEPI/ref=sr_1_1?ie=UTF8&qid=1327870783&sr=8-1

Saturday, January 7, 2012

[Book Review] Crucial Conversations: Tools for Talking When Stakes Are High

Title: Crucial Conversations: Tools for Talking When Stakes Are High

Authors: Kerry Patterson, Joseph Grenny, Ron McMillan, Al Switzler

Link: http://www.amazon.com/Crucial-Conversations-Patterson-McMillan-Switzler/dp/B004H7422O/ref=sr_1_2?ie=UTF8&qid=1325958089&sr=8-2


Overall I found this book to be helpful in putting a finger on such an impossible skill as managing a Crucial Conversation. Over the years I have found myself engaged in crucial conversations where the skills described in the book are necessary but undefined. I am confident that my marriage and relationship with my children will be improved now that we can use a consistent language to manage the Crucial Conversation much more efficiently. For other crucial conversations where I cannot expect stakeholders to be aware of this language (like co-workers), I can now review a concise cheat sheet and help steer the conversation towards a healthy outcome.

The authors have a diagram they intend to be a tool, but I find it unhelpful.

[image: the authors' Crucial Conversations model diagram]

It is clearly too abstract – probably because they don't want people to skip reading their 200+ page book. Well, I can't read the book before every Crucial Conversation, so I decided to create my own.

[image: my Crucial Conversations cheat sheet]

In creating this, I found that I simplified some of the skills. I effectively combined the Master My Stories (not shown above) with STATE my path and explore others’ path because I feel that this should be a collaborative process where everyone is playing by the same rules. I think the authors separated it so much because some things are outside your control while exploring others’ path, but I feel like I want to be thinking about those things for both.

I also did not include Move to Action, as I feel that is a separate problem. It is surely related, but there are great resources that focus on just that problem. In the context of a crucial conversation, if we can get through high stakes, varied opinions, and strong emotions to come up with a solution, we can take a break, and maybe even go to another cheat sheet, to find a way to Move to Action.

I simplified a concept called CRIB, which is meant to Make it Safe. CRIB stands for Commit to seek mutual purpose, Recognize the purpose behind the strategy, Invent a mutual purpose, and Brainstorm new strategies. I found this to be highly repetitive and therefore overly complex. I replaced it with find common ground. This simple statement provides a great reminder, if the conversation has gone to silence or violence, to make it safe again by finding common ground.

I ignored one of the clever stories. I found that victim and helpless are too similar, and keeping both would clutter my cheat sheet. When I tried to find a difference, a victim had a villain, at which point it's more of a villain story.

Though I know I am an improved human being after reading this book, I was disappointed, while reading, at how long the book is. The book is under 250 pages but could easily be under 100 pages. Much of the simplification I did in my cheat sheet was being performed by my mind while I was reading. It is disappointing to finish reading a paragraph or page and then say – wow – that didn't really add anything new.

I do recommend this book.

Wednesday, December 14, 2011

declaratively prefixing a method call with base for readability

So one benefit of working with people much smarter than me is gaining new insightful perspectives.

One topic discussed today was using this and base when calling methods / members from a derived class. I argued that I found it helpful to prefix a call with base when the method I was calling was on the base class. That way, when I read the class later in life, it is clear to me not to look for that method on this implementation and to go look at the inheritance chain. As an example:

    public class Vehicle
    {
        public string Honk()
        {
            return "honk honk";
        }
    }

    public class Truck : Vehicle
    {
        public void Turn()
        {
            base.Honk();
        }
    }

So here my truck is leveraging Honk from the base class in some other method. One brilliant co-worker went into how this is bad from a readability perspective because of how the mind processes things visually. In other words – what follows is very hard to read…

    public void Turn()
    {
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
        base.Honk();
    }

Note that calling the same method over and over is not the point; the point is that having the repeated base. prefix is distracting. A more concrete way to look at it: if you are able to parse, say, 6 characters of each row as you scan through the code, the first 5 are taken up with base., leaving you with less room for the part that really matters.

While I think this holds a lot of weight as an argument to avoid prefixing with this, I still stand by my comment that the base. prefix is helpful and is generally used sparingly enough not to be a real visual distraction.

But – in comes another brilliant co-worker with a totally different perspective. “What happens,” he says, “when you override that implementation in your derived class...” Now that would kind of suck. A change (overriding Honk) would have side effects on everything in the derived class that was calling it (like Turn).
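To make that concrete, here is a rough sketch of the hazard. Note that Honk is marked virtual here so it can be overridden, which my original example above does not do:

    public class Vehicle
    {
        // virtual is an addition for this sketch so the method can be overridden
        public virtual string Honk()
        {
            return "honk honk";
        }
    }

    public class Truck : Vehicle
    {
        // Later, Truck gets its own Honk implementation...
        public override string Honk()
        {
            return "HONK HONK";
        }

        public void Turn()
        {
            // ...but this explicit base. call still runs Vehicle.Honk, quietly
            // bypassing the override, so every base.Honk() has to be revisited.
            base.Honk();
        }
    }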

So in conclusion: I stand by my argument that prefixing with base adds readability, I now respect that it must be used sparingly for visual reasons, and the added value isn't worth the cost if you might end up overriding that method in your derived class later. I love working with people smarter than me!

Sunday, December 11, 2011

RTM Charts–YAY!

So I am a pretty heavy user of Remember the Milk. By heavy user, I mean that in 2010 I completed 3,095 tasks.

[image: my 2010 completed tasks in Remember the Milk]

As I have converted a weakness (forgetting those little things I commit to) into a strength (thanks to RTM) by remembering everything, I have found myself with a new problem. If you remember all the little things, you won't have enough time in the day (or money to pay someone else) to do all those little things… Oh no!!

So I have had to become a master at prioritizing (having a 4 month old at home forced me to become a master very quickly).

This prioritization brought me to a new problem – if it's not a priority today, then where do I put it? I used to “postpone” it for some arbitrary amount of time (2 days or a couple weeks). This naturally led to a high-maintenance problem of re-postponing tasks over – and over again! One solution was to remove the due date on tasks that don't need to happen. I know that one's going to come back to bite me as that list (those without dates) just keeps getting bigger. But for my immediate sanity – that is helping a lot!

But some tasks are a near term priority (this month) and should not slip into the black hole of “no due date.” So I can push it to a day that I think should be free – but as I keep pushing tasks to that day – it soon is not so free.

So I finally bit the bullet and built a visualization to help me solve this problem.

[image: bar chart of hours committed per day over the next 7 days]

This shows me how many hours I have committed to each day, 7 days out. Using this I can push things to the right day, spread the load 7 days in advance as days come into focus, and breathe a sigh of relief as I am actually completing what I set out to do, instead of drowning in task inundation!!

So how did I do it?

I leveraged the REST service RTM exposes and a C# API for accessing it, available here. Note there is a bit of a learning curve and some trial and error in figuring out RTM's authentication model that I am not commenting on right now.

I then created my own REST service that returns JSON using WCF.

Lastly, I wrote some JavaScript using the Google Charts API to actually draw the visualization. Thanks to Google, it is (slightly) interactive; I can hover over the bars and get the exact measurements.

[image: hovering over a bar to see the exact hours]

Now back to getting those 7.5 hours worth of stuff done!

I built this for my own use, but if there is demand, I could make it more user-based so that you could log in and see your own dashboard. So if that is something people want – let me know.

jon

Friday, December 9, 2011

Simple WCF Rest Service

I recently found myself struggling to get a simple REST service working through WCF. I have very little experience with WCF, and my REST experience has been through ASP.NET MVC. As easy as it is in MVC, I was expecting it to be easier in WCF (after all, there should be less to worry about).

After struggling for 30-60 minutes and fumbling with how to word my problem in Google, I finally found this posting, which saved my day…

http://agilewarrior.wordpress.com/2010/12/19/how-to-create-a-simple-wcf-rest-service/

Long story short, change the Factory and gut the web.config – now that is easy!

<%@ ServiceHost Service="Services.InvoiceService" Factory="System.ServiceModel.Activation.WebServiceHostFactory" %>
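For context, here is a minimal sketch of the kind of service class that works with WebServiceHostFactory and a gutted web.config. The GetInvoice operation and Invoice type are made up for illustration; only the Services.InvoiceService name comes from the directive above:

    using System.ServiceModel;
    using System.ServiceModel.Web;

    namespace Services
    {
        [ServiceContract]
        public class InvoiceService
        {
            // WebGet maps an HTTP GET to this method and serializes the result as JSON.
            [OperationContract]
            [WebGet(UriTemplate = "invoices/{id}", ResponseFormat = WebMessageFormat.Json)]
            public Invoice GetInvoice(string id)
            {
                return new Invoice { Id = id, Total = 42.00m };
            }
        }

        public class Invoice
        {
            public string Id { get; set; }
            public decimal Total { get; set; }
        }
    }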

BTVWAG: Software Craftsmanship from a Product Owner's Perspective

I attended my first BTVWAG meeting and was very impressed with the turnout. I didn't count the attendees, but I expect there were around 30. I was especially surprised to see no one I recognized from the Vermont.NET User Group. In fairness – the .NET meeting was Monday, so two meetings in 3 days is a bit much – especially during the holidays – but it was well worth it.

Rik Dryfoos gave a great talk on Software Craftsmanship. Though the talk was aimed at management / product owners (ultimately getting buy-in), there was a little something for everyone, and ultimately getting the coders and management on the same page regarding code quality is greatly needed – and now we have a term for it – Software Craftsmanship. Why do we need a term for it – what's wrong with Code Quality? In short, in my opinion, it's because writing great code is more of a craft than an engineering discipline. It takes a new term to move heads.

If you weren’t able to attend – take a few minutes and race through the slides available here.

Now for my own notes from the meeting:

Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin was added to my Wish List (hopefully that means someday I will find the time to read it).

The term YAGNI (You ain't gonna need it) was mentioned briefly and I wasn't familiar with it, so a quick Google search led me to the wiki, where I realized that it is the idea that you should wait to do that really cool thing until that really cool thing is needed. I have a poster on my wall that communicates this same message, which can be found on slide 22 here (I added a screenshot below).

[image: screenshot of the poster from slide 22]

Code smells were also brought up, with a great link here. I hope to eventually find some time to automate the measuring of these code smells. I know FxCop does a lot of them, but wouldn't it be great if you could measure the total stench of your code?! We have to be careful though – measuring gets us a little too close to engineering…

One smell that was new to me was Cyclomatic Complexity, which, in my own words, measures the number of computational branches your code can take. In other words, if there are no if statements, then the code has only one path and a Cyclomatic Complexity of 1. I find this smell particularly interesting because I could use it to ask myself: is this module complex for a good reason, or did it just turn out complex?
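A quick illustration in C# (the methods and numbers are my own made-up example):

    // One path through the method, no decision points: Cyclomatic Complexity of 1.
    public decimal Subtotal(decimal price, int quantity)
    {
        return price * quantity;
    }

    // Two if statements add two decision points, so the Cyclomatic Complexity is 3
    // and there are several distinct paths a test suite would need to cover.
    public decimal Total(decimal price, int quantity, bool isMember, bool rushShipping)
    {
        var total = price * quantity;
        if (isMember)
        {
            total *= 0.9m;
        }
        if (rushShipping)
        {
            total += 15m;
        }
        return total;
    }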

The Software Craftsmanship Manifesto was introduced, and in its old-style paper layout it feels something like joining a brotherhood.

[image: the Software Craftsmanship Manifesto]

During the presentation an analogy to a blacksmith was made, which is quite fitting with the old-style paper above… One part of this analogy that really hit home for me was that a manager can't mentor a blacksmith; it takes a blacksmith to mentor a blacksmith. I personally have experienced this level of micromanagement and it simply does not work. You can't tell the blacksmith how long it's going to take to mold the metal – he has to tell you.

On one of the slides, it was mentioned that Rik would give his company a Red (not so good) for Pair Programming, which isn’t much of a surprise as most companies struggle to implement Pair Programming. I asked him what it would take to be Yellow or Green. I was surprised by his answer. He said that for Green, every piece of code would need to be Pair Programmed. I respect and appreciate his answer but personally find that to be excessive “craftsmanship.” I have done Pair Programming and can vouch for the value. We were able to come up with a much better design than either of us would have come up with on our own, and we both had intimate knowledge about how and why it was designed the way it was that can’t be compared to trying to “show” the how and why afterwards. But, then we went our separate ways and worked on the less challenging parts of the design in parallel. I can’t imagine the boredom if I were forced to Pair Program an entire application. To go back to the blacksmith example, do you tell the blacksmith that it takes two master blacksmiths to heat up the metal? We might want two when it comes time to shape the metal or some other intricate part of the process, but not the whole process.

Towards the end of the talk during some questions – the importance of the ability to say No was brought up. We need to ensure that our programmers understand the value of speaking up and are enabled to say no. For too long, we have been the blacksmiths that return a piece of metal that soon breaks, because we are never listened to when we say – the fire won’t get hot enough or the metal won’t cool long enough.

I appreciated the talk and especially thank Rik for his time and contributions to a better community of software professionals.

Dynamic LINQ – back to strings!!


Background:

During some recent refactoring – converting some old NHibernate code that was using the Criteria API to instead use LINQ to NHibernate – I ran into a major roadblock. In my previous code I was receiving parameters from my client regarding pagination, sorting, and filtering. Converting the pagination wasn't too bad; then I got to the filtering and sorting…

My client (JavaScript) passes me a string for the column it wants sorted. LINQ wants a strongly typed expression pointing to the member to sort on. I really don't want to write a translation layer that converts every string they could send me into a valid LINQ expression… and then I found Dynamic LINQ.
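For illustration, this is roughly the translation layer I was dreading (a sketch only; sortBy, pageIndex, and pageSize are hypothetical parameters coming from the JavaScript client):

    // Map the client's sort string onto a typed OrderBy, one case per sortable column.
    IQueryable<Person> query = _dao.GetPeople();
    switch (sortBy)
    {
        case "FirstName": query = query.OrderBy(p => p.FirstName); break;
        case "LastName":  query = query.OrderBy(p => p.LastName);  break;
        // ...and so on for every column the client might ever ask for.
    }
    var page = query.Skip(pageIndex * pageSize).Take(pageSize).ToList();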

Dynamic LINQ:

You can find a blog about it by Scott Guthrie here: http://weblogs.asp.net/scottgu/archive/2008/01/07/dynamic-linq-part-1-using-the-linq-dynamic-query-library.aspx

You can download the sample which contains the code here: http://msdn2.microsoft.com/en-us/vcsharp/bb894665.aspx

Down to the Code:

I set up a proof of concept – available here: https://github.com/jhoguet/DynamicLinq---Nhibernate-Proof-of-Concept

Filtering:

So how can we filter on a column name available in string format…
       
    [Test]
    public void ProofWhereEquals()
    {
        var person = _dao.GetPeople().Where("FirstName=@0", "Jon").Single();

        Assert.AreEqual(expected: "Jon", actual: person.FirstName);
    }

and note this gets translated all the way down to NHibernate, generating the following SQLite SQL

select person0_.Id as Id0_, person0_.FirstName as FirstName0_, person0_.LastName as LastName0_ from "Person" person0_ where person0_.FirstName=@p0;@p0 = 'Jon' [Type: String (0)]

not bad – how about something more realistic for a filter… LIKE

    [Test]
    public void ProofWhereContains()
    {
        var person = _dao.GetPeople().Where("FirstName.Contains(@0)", "o").Single();

        Assert.AreEqual(expected: "Jon", actual: person.FirstName);
    }

which generated…

select person0_.Id as Id0_, person0_.FirstName as FirstName0_, person0_.LastName as LastName0_ from "Person" person0_ where person0_.FirstName like ('%'||@p0||'%');@p0 = 'o' [Type: String (0)]

alright I am impressed – but surely it will break if we do a NOT

    [Test]
    public void ProofWhereNotLike()
    {
        var person = _dao.GetPeople().Where("!FirstName.Contains(@0)", "o").Single();

        Assert.AreEqual(expected: "Manrique", actual: person.FirstName);
    }

which generated…

select person0_.Id as Id0_, person0_.FirstName as FirstName0_, person0_.LastName as LastName0_ from "Person" person0_ where  not (person0_.FirstName like ('%'||@p0||'%'));@p0 = 'o' [Type: String (0)]

alright I am impressed!

But can it handle the order by problem…

    [Test]
    public void CanOrderBy()
    {
        var person = _dao.GetPeople().OrderBy("FirstName descending").First();

        Assert.AreEqual(expected: "Manrique", actual: person.FirstName);
    }

which generated…

select person0_.Id as Id0_, person0_.FirstName as FirstName0_, person0_.LastName as LastName0_ from "Person" person0_ order by person0_.FirstName desc limit 1

sweet!

At this point I was satisfied and moved on – I know I read some blogs that spoke of limited support within Dynamic LINQ – but it certainly covered what I needed (I vaguely remember one example LIKE ‘StartWith%’).

I did find one surprising side effect (I might argue bug).

    [Test]
    public void WhenArgIsNullBadThingsHappen()
    {
        try
        {
            var person = _dao.GetPeople().Where("!FirstName.Contains(@0)", null).Single();
        }
        catch
        {
            // you need to check for null in your implementation
        }
    }


When you pass in null as the parameter, you get a misleading exception:

System.Linq.Dynamic.ParseException : No property or field '0' exists in type 'Person'
It took me a little while to figure out that it was looking for a property named ‘0’ because I passed in NULL, but my fix was to add a code branch looking for nulls and avoiding the additional LINQ altogether if the argument would be NULL.
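Roughly, the guard looks like this (a sketch; filterValue is a hypothetical variable standing in for whatever the client sent):

    // Only apply the dynamic Where clause when a filter value was actually supplied.
    IQueryable<Person> query = _dao.GetPeople();
    if (!string.IsNullOrEmpty(filterValue))
    {
        query = query.Where("FirstName.Contains(@0)", filterValue);
    }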