Tuesday, January 29, 2013

Apple made Square: Democratization of payments


The democratization of access to payment processing is being driven in large part by an affordable, well-distributed hardware platform that can run payment processing effectively: the iPhone and iPad. To keep investment costs low for vendors, the variable cost of a payment processing system had to be close to zero. Software, especially when distributed through venues like Apple's App Store, has a marginal cost of zero: once the initial development investment is made, it can be copied and sold indefinitely. The iPhone and iPad, together with the App Store, created a platform technology: many merchants already own an iPad, and for those who don't, the cost of obtaining one is low and is discounted by the device's multi-purpose nature. Instead of specialized hardware just for processing payments, which has to be manufactured, shipped, and installed at merchant locations, the Apple platform enabled an almost purely software solution to the payments problem. The only missing piece of hardware is a sensor to read the credit card, but its cost is marginal compared to that of typical card-processing hardware. By driving the initial investment to zero (the sensor is cheap enough to give away), the barriers to entry for small merchants are significantly lowered.

Traditional payment processing companies faced little competition because of the massive capital required to create the necessary partnerships and hardware. Since merchants depended on their product, payment processors like Hypercom had little incentive to improve or upgrade their POS terminals. This created an opportunity for startups, which had everything to gain by providing a superior product. A pre-existing app and device ecosystem supplied most of the hardware and the majority of the distribution channel (everything but a credit card sensor), freeing a startup to focus on improving the user experience and making payments fun and innovative. That focus put such products at a significant advantage with an increasingly tech-driven population. By letting a new startup concentrate on the software product, on user experience, and especially on reducing the initial investment, the app ecosystem allowed for massive payments democratization.

This is an incredibly disruptive trend because potential merchants are no longer constrained by the cost of entering into business: the beauty of democratization is that now anyone can be a merchant. Barriers to entry in all consumer-facing businesses are significantly lowered by changing payment processing from an asset into a rent: merchants simply pay in proportion to what they sell, or pay a fixed monthly cost. Thanks to payments democratization, every consumer-facing industry will become more competitive, because small vendors can afford to go into business, making skill and quality of product more important than startup capital. While significant barriers to entry remain for small vendors, payments, one of the most crucial aspects of any business, has started to become significantly more democratic. I think we can all appreciate that.

Wednesday, December 26, 2012

Tomorrow should be better than today.

I don't like startups. No, too harsh. It should be "I don't like most startups," or "I don't like a lot of startups," or "I don't like startups that use technology without advancing it or applying it in new and interesting ways," but those are a lot less attention-grabbing. I pulled up my favorite startup editorial website this morning and found, to my great surprise, startups that don't do anything interesting. Photo-sharing apps, mobile social networks, utility-social combinations, and everything else involving combinations of those words. Hugely talented, intelligent, educated people are working eighty hours a week to build a variation on a mobile social network, when both the functionality (networking with people on the go) and the technology (mobile phone, OS, app) have already been invented. We are working off of platform technologies, and for the most part, the functionality of these startups has already been accomplished by the Giants of Tech.

It seems to me that Silicon Valley is full of copycats. Potential entrepreneurs see billion-dollar exits and think, "I could have made that!" They then proceed to make it, even though it's already been done. Not only is the idea unoriginal (many of the best companies don't have original ideas), but the execution has already been done too. When incumbents hold a large share of the market, it's highly unlikely that a new product with no new technology, just a slight spin on the theme, can capture any significant portion of it. The premises of these companies are as false as the mortgage-backed securities of '07. The tiny marginal benefits of these startups don't qualify as "better" in my opinion. But there is hope.


The good startups are out there, pushing the edge of technology. People living out of garages building internet-connected robots, companies building SaaS that cuts the marketing and IT costs of companies everywhere, companies figuring out how to make computers learn, building intelligent software that understands the past and predicts the future: these are the Tech Giants of tomorrow. Good startups have vision. They don't want to continue the status quo; they see a problem and try to fix it, or they see a technology and understand how it can make life better. For me, the rule of thumb for a good startup is: Tomorrow should be better than today.

Monday, May 14, 2012



My education is kind of unusual in that I'm not really a "business guy" or a programmer. By studying both as an undergrad (EE/CS + Business), I've somehow landed in a slightly awkward zone. When I apply to work for a company, there's always the choice bubble: Software Development Intern, Business Intern, etc. I always delay making that choice. I had an interview last fall with a bigger company, and I somehow avoided that question until after my interview. The guy asked me, "So... what position are you applying for?" "Uh... where do you think I would fit best?" I wasn't getting that job. And the thing is, I didn't want it, because there was nowhere in that company where I would both fit and be fulfilled.

The conventional wisdom is that specialization yields results, greater efficiency, and less chaos in corporate structure. Even startups tend to have this bifurcation: BD or Dev? Yeah, it makes sense in some ways; "if you can't code, then you shouldn't try." I don't know how much I like that logic, though, because it's often extended (wrongly) to say, "if you can code, you can't do anything else." It reminds me a lot of high school, where the perception was that the good writers weren't great at math and science, and the science nerds couldn't write at all. But here's the deal: that wasn't true at all. It was something mediocre kids said to make themselves feel better, like "they're just really specialized at one thing; I'm a jack of all trades." What actually happened was this: the science geeks were usually nearly as good at the humanities as the kids considered the best writers, and vice versa. Kids who did really well in either thing were good problem-solvers, and that applied to everything.

The geeks did science because they loved it, but to go back and tell those kids, "You can only do science from now on," and to tell the humanities kids, "You had better stick with writing rather than try to catch up with the science kids," is crazy. It would close off boundaries and hurt productivity more than it would help. This problem parallels the business/programmer divide, and it happens every day. Software developers should be given the opportunity to take on more business-oriented roles, if only to "see the other side" and to understand why some restrictions are placed on their coding for business reasons. And not all business guys are stuck on high horses, unwilling to get their hands dirty with code. I know several Wharton students who picked up programming and are now among the best programmers I know, better than most of the engineering students I know. Closing people off into the boxes they were hired into shuts off that potential; it's hard to develop a skill on your own while working a full-time job that doesn't require it.

My suggestion: hire people based on what they have done and what they know, but also on how they think and reason through problems. Good problem-solvers can be business or software people. A business/programmer divide forces people to ask, "Is this a programming problem or a business problem?" Only if the answer is their specialty will they actually get the job done. More flexible roles, on the other hand, would let people simply do the task while they are capable of either doing it or figuring out how it should get done. Trust me, this do-while statement is much more productive than the former if-then statement.
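The closing metaphor can be made literal with a toy sketch in C. Everything here (`Problem`, `specialist`, `problem_solver`) is an invented name for illustration, not anyone's real API:

```c
#include <stdbool.h>
#include <string.h>

/* A toy "problem," tagged with the domain it appears to belong to. */
typedef struct {
    const char *domain;
    bool solved;
} Problem;

/* The if-then hire: acts only when the label matches the job title. */
void specialist(Problem *p, const char *my_specialty) {
    if (strcmp(p->domain, my_specialty) == 0) {
        p->solved = true;   /* it's my specialty, so I solve it */
    }                       /* otherwise it waits for someone else */
}

/* The do-while hire: works on it until it's done,
 * regardless of which box the problem was filed under. */
void problem_solver(Problem *p) {
    do {
        p->solved = true;   /* do it, or figure out how to do it */
    } while (!p->solved);
}
```

A "business" problem handed to a "programming" specialist never gets solved in the first version; in the second, the loop runs until it does.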