SWIFT Heist, Apple Pay and APIs

It's been a while since I last posted, and a lot has happened in the payments world. Today I reflect on three items of interest: what happened with SWIFT; why the Australian banks are trying to bully Apple by forming a cartel; and the most exciting development in open access to financial institutions – APIs.

It was simply a matter of time…

There has been a lot said lately about the security of interbank payments. Traditionally these payments flow from bank to bank via the SWIFT network.

The heist of US$81 million in February this year from the Central Bank of Bangladesh was just the start. Since then we've seen more: a Ukrainian heist of US$10 million; Vietnam's TPBank, which was lucky and blocked an attack; Ecuador's Banco del Austro, which lost US$12 million; and a foiled breach of accounts at the Union Bank of India, to name a few.

It is important to note that the SWIFT network itself wasn’t hacked (that’s like saying the ‘Internet got hacked’) – what happened was that the fraudsters attacked the ‘endpoints’ – in other words the ‘terminal’ or the ‘PC’ that is sometimes used to input payment instructions into the network.

Fraudsters have moved on from trying to 'infect' consumer or small-business PCs for small ransoms and are going after the bigger fish – the banks.

In essence, fraudsters use social engineering via email to trick bank employees into opening a seemingly harmless 'invoice' or other 'attachment' that secretly installs a malicious payload onto their PC. This payload waits in the background until it sees the user trying to perform certain actions – say, creating a SWIFT payment – and then injects itself into the middle of that transaction to commit fraud. Worse still, the malware steals the user's user ID and password credentials so it can act on its own.

Interestingly, the malware is sophisticated enough to try to cover its own tracks, so by the time the bank finds out something is wrong it's too late.

These methods take advantage of lax access and endpoint security within some organisations, coupled with a lack of access controls and checking procedures. No bank should allow a single actor to create, authorise and release a payment (especially a multi-million-dollar one).
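
To make the 'four-eyes' point concrete, here's a minimal sketch of a segregation-of-duties check – a hypothetical illustration only, not how any real bank's payment system is built:

```python
# Minimal sketch of a "four-eyes" (segregation of duties) check.
# Hypothetical illustration - not based on any real bank's system.
from dataclasses import dataclass, field

@dataclass
class Payment:
    amount: float
    beneficiary: str
    # Maps each lifecycle step to the user ID who performed it.
    steps: dict = field(default_factory=dict)

    def record(self, step: str, user_id: str) -> None:
        if step not in ("create", "authorise", "release"):
            raise ValueError(f"unknown step: {step}")
        # Reject if this user has already performed any other step.
        if user_id in self.steps.values():
            raise PermissionError(
                f"user {user_id} already acted on this payment - "
                "segregation of duties violated")
        self.steps[step] = user_id

payment = Payment(amount=81_000_000, beneficiary="somewhere dodgy")
payment.record("create", "alice")
payment.record("authorise", "bob")
try:
    payment.record("release", "alice")   # same user who created it
except PermissionError as err:
    print(err)                           # the payment is stopped
```

Stolen credentials are far less useful to a fraudster when no single set of credentials can push a payment all the way through.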

Luckily most larger banks don't allow 'terminal'-based access to the SWIFT network – ideally these things are embedded within data centres far away from unsuspecting users, and other 'systems' are used to create/validate/authorise/release payments. But as we have seen, some banks do.

For their part, I feel SWIFT could do themselves a favour and introduce a set of base security standards for banks who use the system, and seriously back this up with an accreditation program where each connected institution has to pass a series of risk-based tests on a periodic basis. If you pass, good – you stay connected. If you fail, you lose your connection until you pass. That is not to say that SWIFT have been sitting idle – they have in fact been very proactive with banks in recommending security configurations.

Banks, on the other hand, should be reviewing their payment practices to ensure that adequate access controls exist in their payments mechanisms and that appropriate endpoint and perimeter security is employed. Only a combination of efforts from all parties will prevail in combating this challenge.

But as I said, it was always a matter of time. Someone was always going to try to attack a big fish…

I want access to your system, Apple – or I'm going to see the Government…

In Australia, a collaboration of major and minor banks has applied to the competition regulator for 'permission' to form a cartel to collectively boycott Apple Pay, in order to negotiate with Apple over access to the NFC chip that sits inside the iPhone.

Boo Hoo!

I think it’s disgraceful.

What’s this all about though?

With the release of the iPhone 6, Apple introduced an electronic wallet capability called Apple Pay. Originally launched in the USA in 2014, it meant that you could store your credit/debit card on your phone and use the phone at EFTPOS terminals equipped with tap-and-go contactless capability instead of using your actual card.

Google had introduced NFC functionality within Android-based smartphones even earlier. But the implementations of the capability on Google's Android platform and Apple's iOS were different.

Android’s platform was open. Apple’s platform was closed.

What this meant was that electronic wallet developers/suppliers (such as banks) could get access to activate the NFC chip in Android phones – meaning they could develop their own capabilities and control the customer experience. Just as importantly, they didn't have to pay Google to do it.

Apple’s implementation of Apple Pay was closed in that Apple did not open up access to the NFC chip to developers. Apple argued that this was to maintain and uphold the integrity of the embedded security around iOS and the iPhone. And Apple takes security seriously. 

To offer Apple Pay, banks and other financial institutions had to partner with Apple, and pay Apple a fee to do it.

Apple Pay was launched in Australia in November 2015 by American Express. In April 2016 ANZ secured a partnership to launch Apple Pay. Since then, no other Australian bank has launched with Apple Pay.
Why?

Well, if you believe the submissions to Australia's ACCC, the other banks wish to develop their own electronic wallet offerings and have greater control of their customer experience. They say that Apple is stifling innovation in this area, and that in order to have a level playing field Apple should open access to the iPhone's NFC chip.

Sounds like baloney to me. The biggest banks across the globe in the USA and UK have partnered with Apple. ANZ in Australia has partnered with Apple. These banks have also developed wallet applications for Android users too.

If you've used Apple Pay, you'll know the user experience is fantastic. It is integrated with the iPhone and Apple Watch and it works great. I have, and I feel very comfortable with the experience and 'security' that I get from the platform. (I wouldn't use an Android phone to make payments though. Too many security vulnerabilities in that ecosystem – remember, access to NFC in Android is open whereas in iOS it is closed.)

No, the other banks aren’t really interested in customer experience. This isn’t about stifling innovation. The banks are simply interested in the money. That’s what this is all about – the cash. At the end of the day they want direct access to the NFC chip so they don’t have to pay Apple to use it.

At least for now, the ACCC have declined the application by the banks to form a Cartel.

So, for the time being, if you wish to use Apple Pay in Australia go to American Express or ANZ … because it will be some time before the other banks jump on board. They care so much about your customer experience and innovation that they're prepared to give you no solution at all rather than one from Apple – because of course we all know that Apple are minnows when it comes to customer experience and innovation. (Being a little sarcastic there.) It's crazy.

APIs – opening up the world to a future of capability

The proliferation of APIs within financial services is arguably the most promising advancement I've seen in recent times.

An API is a piece of software that 'talks' to another piece of software. APIs are not user interfaces. They run behind the scenes, making apps and websites seamless and useful.

Using APIs means that developers don’t have to start from scratch. They are re-usable and flexible.

They also help create loosely coupled ecosystems.

By way of example:

  1. Think of an Apple power pack for a Mac. One end works all around the world – you just plug it into your Mac. To make it work in other geographies you only need to buy the right wall plug; you don't have to buy the entire unit. OR
  2. Think about Lego. The Lego system makes it easy for stuff to connect to other stuff. You could buy a dinosaur Lego set and a car Lego set and connect them together to form a 'Car-o-saur'. You don't need to worry about how they connect, because the system guarantees that they will work together.

This is how APIs work. And we use them every day.

Have you ever uploaded a photo from your phone direct into Facebook or Twitter? If so, you used an API. It was there all the time, just sitting behind the scenes.

Alternatively, think about Uber. Uber didn't create their own maps system to make Uber work. They used existing maps providers such as Google and Apple. The maps API made the Uber app more useful and seamless.
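
Under the hood, an API call is pretty unglamorous. Here's a sketch of the kind of request an app like Uber might make to a maps-style REST API – the endpoint and response fields are invented for illustration, not any real provider's interface:

```python
# Sketch of a typical REST API call. The URL and response fields are
# hypothetical - real providers (Google, Apple) have their own interfaces.
import requests

response = requests.get(
    "https://api.example-maps.com/v1/route",   # hypothetical endpoint
    params={"from": "Sydney, NSW", "to": "Parramatta, NSW"},
    timeout=10,
)
response.raise_for_status()
route = response.json()

# The calling app never cares *how* the route was computed -
# it just consumes the result.
print(route.get("distance_km"), route.get("duration_minutes"))
```

A dozen lines, and the hard problem (routing) is someone else's. That's the re-use and loose coupling I'm talking about.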

From a corporate perspective I can see APIs being used extensively in the future. PSD2 in the EU is an example of where APIs will drive integration between banks and fintechs/corporates. In Australia, the introduction of the New Payments Platform (NPP) will also drive adoption, given the nature of real-time payments and the associated real-time receivables. PSD2 will influence the way we use them in Australia too. APIs will drive ERP systems integration, real-time liquidity, bank reconciliation and receivables management, perhaps even the opening and servicing of new bank accounts.

APIs will also help us unpack and leverage capability from developments in the use of Blockchain (more on that in another post).

The potential is endless.

Watch this space. APIs are the fuel of the future.

Bitcoin and Ripple – Searching for a good problem …

At the risk of sounding like I have an ‘unconscious bias’ against disruptive payments tech such as Bitcoin/Blockchain/Distributed Ledger, I have to say, at this stage I don’t believe that these technologies are ready for banking prime time – yet.

So why would I even say this anyway? Why am I interested enough to even write this?

Because I believe that the blockchain and distributed ledger tech have significant and substantial promise, but I think more work needs to be done on making the underlying technology ‘industrial strength’ – especially for regulated financial services.

"Come on Leigh – go back to your 16th century double-entry ledger, you old relic" – I can hear them say. And yes, I'm a banker of nearly 30 years – but my attitude has always been 'why not?' as opposed to 'why?'. I'm a prolific early adopter of tech and am constantly thinking of ways to improve the client experience, especially in corporate payments.

I'm also a little tired of the hype created by the marketing engines of certain fintech companies looking for a large, reputable financial services organisation to grab the technology and run with it, blatantly disregarding the deficiencies in the current iteration of the technology. The hype just doesn't help. It breeds conversations such as:

CEO: "We need a distributed ledger."

CIO: "Why?"

CEO: "Because everyone else is talking about it."

CIO: "What will we use it for?"

CEO: "That's your problem."

I almost feel that we are searching for the problem (in other words, a 'good use case') that these solutions have been created for. It's so frustrating. I can't be the only one thinking this, however – have a look at this tweet.

Every day I read something that indicates the 'promise'. Maybe that's the frustrating part. It's like the search for extraterrestrial intelligence – we know it's got to be out there somewhere, but we just can't find it yet. Or the challenge of exceeding the speed of light – we stand back and marvel at concepts like 'warp' from Star Trek and wonder how we will ever travel beyond that barrier. We have a stab at what the problem is because we already know what the solution might be.

What is Bank Grade anyway?

How about this for a definition?

Bank Grade payments systems are compliant, they address real challenges, they have clear ROI, and they are scalable.

My analysis of the current technologies shows that, to this point, they fall short on some or all of these basic requirements.

I don't intend to give readers a primer on what cryptocurrency, blockchain or distributed ledgers are. If you are here, it's assumed that you have some knowledge of them and are at the very least interested.

Let's run the test:

Compliant?

I fear not. Transactions are anonymous for the most part (although there is of course a hash-based identifier – if you can link the identifier to an owner then they aren't anonymous anymore). But the system doesn't know who you are, and because of that it is a regulatory nightmare. With 'unpermissioned' blockchains (such as Bitcoin), entry and exit points aren't regulated, so it is virtually impossible to complete any type of sanctions or AML checking. Recently, for example, the Department of Financial Services in New York introduced BitLicense to try to regulate these entry and exit points. The effect, however, was that most Bitcoin exchanges left New York and set up business in less regulated environments.

Interestingly, the kerfuffle was about the community wanting to preserve the 'anonymous' nature of the network. Sounds very bank-like to me.

With Ripple, if you download the ledger you can see each and every transaction. You can see who the counterparties are, who they trust, what their credit lines are – pretty much everything – meaning that privacy goes out the door. Do you care? Do you want everyone to see your bank balance and every transaction you've done? Maybe that's not a problem for you, but banks are built on keeping your financial data private. If you know a counterparty's account number/hash, you can track everything.

Forked ledgers also present a problem. In banking you simply can't have two or more versions of the truth. But inside the blockchain you can … until they resolve themselves … and when that happens you can both win and lose. Banks need one version of the truth at all times – customers expect their balance to be their balance, not a 'version' of their balance. A recent fork occurred in Bitcoin that wasn't resolved for some time – the write-ups used words like 'your bitcoins are safe if conditions x, y and z are met'. How would you feel if your bank balance was referred to as 'safe' only if certain conditions were met? Not very safe, perhaps.

One final point – control. Who regulates this? Who do you go to when something goes wrong? Who is the 'central body' that arbitrates? The answer – for Bitcoin at least – is 'no one'. Having said that, it is the computers that arbitrate – they decide who wins and loses. They do the math. So who owns most of the computers? At present this is the domain of the Bitcoin miners. Miners solve the cryptographic problems that validation of the blockchain requires, and in exchange they get paid – in bitcoins. Two-thirds of mining capacity is owned by six mining pools, and the top four pools are based in mainland China – owned not by government but by private individuals/corporations. Whoever controls more than 51% of the mining resource controls the network. For everyone.
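
For the curious, the 'math' the miners do is brute-force hashing. This toy sketch shows the idea – and why hash power equals control (real Bitcoin uses double SHA-256 and a vastly harder difficulty target):

```python
# Toy proof-of-work: find a nonce so the block's hash meets a difficulty
# target. Real Bitcoin uses double SHA-256 and a far harder target;
# this only illustrates why hash power = control of the network.
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block#1|alice->bob:2BTC|prev:000abc")
print(f"nonce={nonce} hash={digest}")
# A pool with 51% of the total hash rate expects to win most blocks,
# and can therefore out-build everyone else's version of the chain.
```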

So: transactions are anonymous; they are unregulated; they are not private; they are not 'safe' in traditional terms; the network can be controlled by anyone who owns more than 51% of its mining power; and you can't complain to anyone if the system doesn't work or you lose your money.

If that was a catchline for your bank, would you put your money there?

Address Real Challenges?

The main challenge that distributed ledger industry favourite Ripple says it addresses is 'speeding up' cross-border payments. How? From what I can gather: transactions can be agreed in real time, including the FX rate conversion; inter-party accounting is completed on the distributed ledger; and each party uses the protocol to agree on these things regardless of the time of day.

But does that mean that you actually get your cash faster?

That depends on whether the receiving bank has a way of interacting with the ledger in order to deposit the funds into your account in real time (taking into consideration the sanctions and anti-money-laundering filtering that banks are obligated to complete on any incoming or outgoing transfer). For larger transfers, or where there is high volume, the bank also has to complete other hygiene matters such as liquidity management for its own treasury function (which is both good governance and good cash management – but you probably don't care about that). All you want is your money, and this takes time.

But the Ripple system works on IOUs. In other words, the 'instruction to transfer' is still separate from the 'settlement of the transfer'. Ripple is an 'instruction to transfer' system. It doesn't do settlement – because settlement needs the transfer of actual money between counterparties. Instead, Ripple issues IOUs between counterparties, and they must then complete settlement via the traditional SWIFT network. Ripple is NOT a replacement for SWIFT, because that is how banks settle their balances with each other. There is an exception – if banks hold Ripple's native currency (XRP) they can just settle in that.

Unless, of course, each and every bank ditched SWIFT altogether and banks all around the world invested their funds into buying XRP.
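
A crude way to picture the split between instruction and settlement – a conceptual sketch only, not Ripple's actual data model:

```python
# Crude sketch of "instruction to transfer" vs "settlement".
# Conceptual only - not Ripple's actual data model.
from collections import defaultdict

iou_ledger = defaultdict(float)   # (debtor, creditor) -> amount owed

def instruct(debtor: str, creditor: str, amount: float) -> None:
    """Record an IOU on the ledger - no money has moved yet."""
    iou_ledger[(debtor, creditor)] += amount

def settle() -> None:
    """Net the IOUs and move real money - e.g. via a SWIFT payment."""
    for (debtor, creditor), amount in iou_ledger.items():
        print(f"SETTLE: {debtor} pays {creditor} {amount:,.2f} "
              "via traditional rails")
    iou_ledger.clear()

instruct("BankA", "BankB", 1_000_000)   # agreed instantly, 24x7
instruct("BankA", "BankB", 250_000)
settle()                                # the actual transfer of value
```

The instructions can be agreed at 3am on a Sunday; the money still moves the old way.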

Ripple ignores these factors, or at the very least brushes them aside. The 'last mile' (actually getting the cash into a consumer's hands) is a challenge – for Ripple and for traditional banks alike. Ripple still needs the bank to execute the cash transfer, typically via traditional means.

The blockchain itself can also be used for other real-world challenges – this is where the concept of 'smart contracts' arises. Each clause is a programmatic instruction, and once signed the clauses can execute automatically – the ledger keeps track of which clause was executed, when, and for what/to whom. The contract can be for trade or almost anything – it doesn't just have to be about Bitcoin (or fractions of a coin).

But we have systems of record today that solve these problems – perhaps not as efficiently, but they solve them nonetheless. The area has significant promise though. I think that with the passage of time the underlying blockchain technology will get better and will be embraced by smart innovators who use the blockchain as a building block for interesting applications we may not have even thought of today. In effect, the blockchain will be wrapped in layers of value-added IP that solve real-world problems by leveraging the basic framework of a programmatic, decentralised ledger.

Clear ROI?

It costs a lot of money to run the Bitcoin network. Bitcoin transaction processing consumes significant resources – a recent study concluded that the combined electricity consumption of all miners was comparable to the entire energy consumption of Ireland.

As the blockchain becomes longer, the computing required increases to a point of diminishing returns, making it commercially uneconomical. That's based on today's tech though – Moore's law will have an influence here.

Even with more corporate applications such as Ripple, as previously detailed you still need the banking infrastructure to complete the last mile. So given current use cases, a lot of change is required (in risk tolerance, acceptance of non-traditional records of truth, and removal of traditional reconciliation practices) to provide enough ROI to balance it all out. At present there are significant unquantified risks with no clear mitigation strategy or payoff. Bitcoins themselves are subject to extreme currency volatility (good if you're a betting person), transaction fees are relatively high (averaging 2%) and variable depending on spam (called 'dusting'), and the FX conversion from BTC to fiat is expensive when compared to retail FX margins.

That is not to say that there are no opportunities. The tech just has to mature. Or we have to revise the underlying frameworks to better provide for financial services use cases.

Scalable?

Another bummer at present. Bitcoin takes between 10 and 60 minutes to validate and clear a transaction – unsuitable when compared to a real-time credit card or bank account transaction.

Additionally, Bitcoin currently scales to only around 7 transactions per second (by design). That's not very good compared to what global markets require (tens of thousands per second or higher). You see, size matters. Theoretically the throughput of the system could be much higher, but that requires a larger block size. Work has been done recently to increase the speed, but that requires consensus from the miners … and they don't agree. Which takes us back to the 'who controls the network' issue.
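
That 7 transactions per second figure falls straight out of the protocol's parameters. A rough back-of-envelope using commonly cited numbers:

```python
# Back-of-envelope Bitcoin throughput, using commonly cited parameters.
block_size_bytes = 1_000_000      # 1 MB block size limit
avg_tx_bytes = 250                # rough average transaction size
block_interval_s = 600            # ~10 minutes per block

tx_per_block = block_size_bytes / avg_tx_bytes    # ~4,000 per block
tps = tx_per_block / block_interval_s             # ~6.7 per second
print(f"~{tps:.1f} transactions per second")
```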

I previously discussed the electricity costs, which suffer from the same diminishing-returns problem and ultimately depend on how efficiently computers can solve the mathematical problems of validating the blocks.

On the Ripple side, we need significantly higher participation from traditional banks and financial institutions to provide enough scale to start replacing traditional systems of cross-border payments – but because of the challenges mentioned previously, we just don't have the scale required.

In Summary

I'm not against the technology. I think it has massive potential … with the right use cases and with solutions to the problems presented. And for those who think I'm just an old-school banker who is afraid of potential or disruptive change – think again. The discussion sort of puts you in a combative position, which doesn't feel right. I am on the same side – I want to use the technology. If I could go to my boss and say:

“I have a solution for you. It will save in costs, it is super efficient and scalable, will speed up payments and improve customer experience, covers the bases in terms of fraud control, sanctions and AML, and help us improve revenues.”

Do you think I would? Absolutely.

I hope one day soon I can. I just can't right now. It's not bank grade – yet. It's a solution waiting for the right problem, and the solution (and perhaps the underlying framework) needs more work to be better suited to robust, regulated financial services.

Post-publishing note: Ripple did this past week announce 'Interledger' as a possible solution for some of the deficiencies I highlight above. I haven't done a full analysis of the impacts of that innovation, but it does look like they've wrapped a layer of fresh IP around the underlying framework in order to make their use case stronger. Good work, Ripple!

Is online banking dead?

I must say that when I say online banking I mean desktop banking via a web browser. It's so '90s, but our banks still force us into an online desktop experience that they think we like. Most of us only do it because we have to, not because we want to. But is online banking dead? Or at the very least dying? And what is this 'digital' word that keeps popping up in numerous business-related texts – and are the two topics related?

What does ‘digital’ mean?

To answer this I think we need to look at what is happening in the demand for online services across a spectrum of users, and what those users are generally doing. But first:

Forbes – IDC: 87% Of Connected Devices Sales By 2017 Will Be Tablets And Smartphones

Gartner – Gartner Says Worldwide Traditional PC, Tablet, Ultramobile and Mobile Phone Shipments to Grow 4.2 Percent in 2014

Just a few of the articles that come up when searching "Smartphones vs PC sales" on Google.

Smartphone shipments started to exceed PC shipments in 2011/2012. Since then, PCs have experienced a small resurgence, but nowhere near the volume of smartphones (including pads). You can see this in practice just by looking at what people use to connect these days – on the train or bus, in cafes, walking around – people are using their smartphones. Even office workers, whilst using their employer-connected PC, have their smartphone or pad not very far away – for reasons ranging from keeping their private life their own through to corporate IT policies that ban internet browsing, social media and the like, so that individuals need their own device to do whatever it is they want on the internet (including banking).

Most of us then, for the majority of our activities, use a smartphone or pad (let's call this 'mobile' from now on). We use PCs also, but the first device we use is a mobile device (or, for some of us, a wearable). We sometimes then move to a PC when we need to write or examine certain types of content in detail. Apple pretty much acknowledged this when they created their Handoff feature, and Microsoft are introducing a similar feature in the upcoming Windows 10. I must admit I love using Handoff – I can start something on the mobile and then just move to the Mac (or vice versa) without losing any productivity.

A colleague recently told me that their interpretation was that:

Digital = Mobile First

And based on the above I would have to agree. I tested it on myself first, though, before completely conceding. Here are a few examples:

  1. I went to pay for something that I purchased for the family on an online auction site. When I went to pick it up the seller wanted ‘bank transfer’ for the funds. Guess what I used – the banking app on my mobile.
  2. What do I read the news on every day? – yup – my mobile
  3. Where do I read my email first? – you guessed it – my mobile
  4. Banking/Share Trading – What do I use first? – the app on my mobile
  5. Where do I get my online alerts? – my wearable/mobile.

So for me at least the statement is true – and I'd say this is the case with most connected individuals these days. Don't get me wrong – I love my new MacBook Retina – but it's not the first digital device I use each day. Jumping from meeting to meeting with a busy lifestyle means that I depend on my mobile for most things – business and personal.

Having said all of this, let's now tie this back to the 'online banking' thing (remember: online banking = desktop banking; mobile banking = smartphone/pad banking).

If we are all using our mobiles first, how could online be thriving? Let's look at a few different customer segments to build an overall picture.

Institutional clients – they are moving from the 'interactive' space to a non-interactive space. Corporates and multinationals wish to use the investments they have made in their own systems rather than the systems provided by the bank. MNCs would much rather incorporate a set of banking APIs into their investments in treasury management systems or ERP systems such as PeopleSoft, SAP or Oracle. I haven't yet met a corporate treasurer who loves the proliferation of RSA SecurIDs or One Time Password (OTP) devices needed to stay on top of their multi-bank relationships. Usually these things are hidden in a drawer or given to an assistant. In fact, most people responsible for banking in MNCs or corporates would much rather just use the application their employer has provided and leave the banking part to sit underneath. SWIFT even developed SCORE (the Standardised Corporate Environment) so that corporates can integrate directly with banks' back-end systems instead of front-end online systems. Banks have therefore evolved to provide corporate mobile apps so that the busy executive can use their mobile to authorise payroll or creditor transfers and the like instead of having to log onto the desktop service.

Business clients – similarly, these clients want integration with their accounting software suppliers such as MYOB, Quicken, Xero and the like. This market segment hasn't got time to compare their journals to their bank account and do reconciliations – they want an integrated experience. These suppliers are now looking to the bank to provide the back-end integration traditionally reserved for the institutional segment. This segment also increasingly uses mobile devices for invoicing, merchant acceptance and payments, and most banks have therefore provided mobile apps to satisfy customer demand.

Retail clients – want the most convenient way to do their banking. With most consumers having a mobile device – and if they are like me, using mobile first – most banking means using the mobile to check balances and do transfers, and using ATMs to get cash. You can even use your mobile now to get cash from an ATM without the card. Otherwise we all use credit/debit cards to pay for stuff. Retail customers are forced by their banks into online banking for things like making a transfer above a certain amount or downloading statement data – otherwise you use the mobile app, or most recently the wearable app. Online banking isn't first; it's nearly always second or third in the list. An argument could even be made that online banking is less secure than mobile banking (especially with the rise of biometrics versus a simple username/password, and banks building security right into the app instead of worrying about the risks of a 'hijacked' internet browser – ironically, most banks using One Time Passwords (OTP) send them via SMS to your mobile anyway). At the very least, mobile is more convenient.

Notice the wearable banking app in this Apple Watch home screen shot (hint – it's the red circle with the 'W').

Putting all of this together, an argument can be formed: if 'digital equals mobile first', with PC sales exceeded by mobile (meaning most people have a mobile), with the institutional and business sectors moving to non-interactive banking for payments and reconciliation (but using mobile for authorisation), and with retail using mobile and wearables first purely for convenience – then online banking, whilst perhaps not dead, is heading for the back seat.

I think so.

The Opportunity in Corporate Payments

There is currently a significant amount of focus on consumer payments, which is understandable given the success of ventures such as PayPal and the enablement of consumers to make payments easily by way of online credit card gateways and the like. Silicon Valley payments startups are trying to capture the massive volume in enabling consumers to pay easily.

Banks used to hold this domain with their customers.

The space changed 10-15 years ago with the rise of the internet, and the slowness of many banks to respond to the needs of businesses that wanted to move online and then needed a payments gateway to facilitate the payment. Banks started to separate the 'capture' of the transaction from the 'clearing' of the transaction.

This meant that some (but not all) banks sent their customers to newly formed 'credit card gateway' companies whilst continuing to offer the underlying 'merchant' facility to the customer. The credit card transaction was 'captured' by the online gateway provider and 'cleared' by the bank. Online gateways were agile and could develop application programming interfaces (APIs) that allowed merchants to integrate their online shopping carts with the credit card system to enable payments. Banks were typically big and slow; whilst offering a very reliable merchant service that underpinned the card transaction, they weren't very agile at developing online APIs for merchants to integrate against, and were even less agile (or it didn't fit their business model) at delivering integration assistance to their customer base.

Nowadays, if you are an online merchant the typical model is to buy your card capture product from an online gateway (or even PayPal) and have that gateway integrate with your chosen bank's back-end merchant facility.

The bank part is a necessity (and taken for granted); the capture part is what the customer really needs. In essence, that is the product being purchased. Online gateways have been very smart, continually adapting and offering value-added services (such as card reprocessing and tokenization for PCI DSS compliance), integration services, out-of-the-box shopping cart modules and the like.

What once was the domain of the Bank has now been commoditised and disaggregated from the Bank’s own offering.

As we move into corporate payments processing, it is important to examine the way that corporations 'tender' their banking business to the banking community. Traditionally, a corporate would issue an RFP for the provision of banking services. A shortlist would be created, various banks would tender, and more often than not a sole provider of transaction banking services would be selected. That provider would normally win the right to provide core banking services (such as bank accounts, liquidity (debt) and FX) and ancillary 'value-added' services such as payments processing, merchant processing, supply chain, corporate card programs and the like.

When the GFC hit, corporations had to diversify their banking relationships to reduce counterparty risk. This meant spreading their banking services across a number of banks – each relationship meaning a new online banking site, many and varied authorisation dongles/tokens, a different bank for a different region and so on. Each bank had its own particular way of doing host-to-host services, ERP systems integration, treasury management, payments processing, receivables and the like. This has created a headache for corporates wishing to be more efficient in payments and treasury operations, as most banks have their own proprietary way of achieving this processing. The industry has done its best in agreeing new standards such as ISO 20022, but not every bank has adopted them, and most banks don't agree on the bilateral mechanisms residing within the standards to achieve efficiency in payments processing. In essence, each integration with a bank is a new project.

At the same time, most banks haven't changed their delivery model for payments processing product. They still respond to tenders for transaction banking solutions by offering the core banking services tied to the ancillary value-added services. This, by its very nature, forces a corporate to buy its banking product from many providers simply to do business.

What if you could buy your banking product from a universal/independent provider (especially for payments processing and integration) and then let that provider do all the back-end 'stuff' with your chosen bank in your chosen region? How would that change your perspective?

To use the previous analogy, instead of the bank doing both the payment 'capture' and the payment 'clearing', why couldn't another provider build a payments 'capture' engine and then let the chosen bank do the 'clearing'? With the world becoming more real-time and 'instant', there are now methods of connecting to banking services that mean a corporate could buy their product not from a bank but from an independent supplier that is connected to the bank for clearing.

My hypothesis, then, is that the transaction banking payment 'product' (capture of the payment) will become disaggregated from traditional bank 'clearing' systems. Banks will find themselves competing with tech companies for payments processing and for other value-added services that simply need a smart way to connect to the bank, whilst at the same time increasing efficiency and reducing complexity for the client.

This creates an opportunity for tech startups to be innovative in payments and integration processing, aggregating between banks and corporates and using best-of-breed agile technologies to create efficiency for corporates who just want to worry about their own business instead of worrying about the bank processing their payments.

The opportunity in Australia alone is huge – look at the table below:

High Level Transactions Statistics (apca.com.au)

                 Volume      Value       Users
DE (Credits)     5.3m/d      $24.3b      307,027
DE (Debits)      2.5m/d      $19.9b      24,164
Cheque           0.7m/d      $4.8b       –
High Value*      –           $96.5b      –
Debit Cards      312.1m/m    $18.1b/m    25.8m^
Credit Cards     164.9m/m    $22.1b/m    26.5m^

*These figures are values exchanged and do not include “own items”. Note also that a full picture of RTGS transactions would require HVCS transactions to be supplemented by Austraclear and RITS transactions which are not captured by APCA.

^Customer Payments Accounts cover day-to-day accounts and include: cheque, statement, savings and passbook accounts.

Banks will need to think differently about how they bring innovative product to market, and the speed at which they do it. Their ability to change quickly to meet client demands will become increasingly important – especially if my hypothesis is true and they're competing against tech companies.

Watch this space. It will be very interesting.

What will Australia’s New Payments Platform (NPP) leave as roadkill?

Before you start reading, I must stress that this article represents my own thoughts, not a 'corporate' hypothesis. This blog has been in draft for too long, and I've sought a fair degree of feedback along the way (thanks to those who helped mould my thoughts).

There is a new payments system coming to Australia in November 2016. It's called the New Payments Platform (original, huh!) or NPP.

Why is it new?

Because the last payments stream introduced into Australia was Real Time Gross Settlement (RTGS) in 1998, some 16 years ago. The year before that we had the introduction of BPAY (which isn't actually a clearing stream, rather a product residing on top of a clearing stream). Both are still in use today – RTGS more in the corporate scene (consumers can use it over the counter at their bank branch), while BPAY was intended for, and is used by, consumers to pay bills. So NPP is new because, in comparison, the other stuff is old.

You can go to the Australian Payments Clearing Association (APCA) site to view the history of payments in Australia and look at how ‘old’ the other payments systems are.

Perhaps the other reason it's 'new' is because the intent of the system is to be 24×7, with payments settled between bank accounts at different banks within seconds, rather than the current scenario of up to 30 minutes for RTGS and next day for Direct Entry. Additionally – and perhaps more importantly – the NPP will allow a significantly greater amount of data to flow with each payment. NPP will be based on the ISO 20022 standard for payments. Presently the payments streams allow only a very limited amount of data to flow with the payment, and over the years this has presented significant challenges to the industry, especially in areas like payments reconciliation.

For example – have you ever wondered why your bank only allows you 18 characters of remittance information when you make a payment? It's not because their online banking systems are unable to do more – it's because the payments system that underpins your EFT transfer was built in 1974.

Back in 1974, Gough Whitlam was PM of Australia, the number 1 song was Barbra Streisand's 'The Way We Were', Australia's first credit card (Bankcard) was introduced, and John Lennon made what would be his last stage performance in New York with Elton John. Oh, and 200 MB of disk storage cost the equivalent of US$186,000 today. In 1974, the storage in the most basic iPhone would have cost you nearly $6 million in today's money! Try getting that on a 24-month plan from your telco.

The point being that when the banks designed the EFT clearing system, disk storage cost a lot – and so you only got 18 characters to tell the person you were paying what the payment was for. That same system is still in use today. In fact, it underpins the current online banking systems of most banks, credit unions and building societies in Australia, processing 7.9 million transactions per day worth $44.2 billion (source: APCA).
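
To give a feel for the difference, here's an indicative sketch of the richer, structured remittance data an ISO 20022 style message can carry. The element names are in the spirit of the standard's pain.001 credit transfer message, shown as a simple data structure rather than schema-valid XML:

```python
# Indicative (not schema-valid) sketch of the richer data an ISO 20022
# style credit transfer can carry, shown as a Python dict for brevity.
payment = {
    "GrpHdr": {"MsgId": "ABC-20161101-0001",
               "CreDtTm": "2016-11-01T09:30:00"},
    "CdtTrfTxInf": {
        "Amt": {"InstdAmt": "1450.00", "Ccy": "AUD"},
        "Cdtr": {"Nm": "Acme Utilities Pty Ltd"},
        "RmtInf": {                      # structured remittance data -
            "RefNbr": "INV-2016-08817",  # not 18 characters of free text
            "InvoiceDate": "2016-10-15",
            "Note": "Quarterly electricity, account 5531, incl. GST",
        },
    },
}

# The 1974-era Direct Entry format, by contrast, gives you roughly this:
legacy_remittance = "ELEC Q3 INV 8817"[:18]
```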

But what will happen when NPP commences? What payments systems will it kill on the way to Glory?

A few factors need to be taken into consideration:

  1. Will the industry impose transaction or processing limits for NPP?
  2. What will a payment cost?
  3. Will both parties (the payer and the receiver) be able to participate in NPP even if only one party has 'paid' for NPP enablement?

What would happen in Australia if there were no limits for an NPP payment?

Most countries that have introduced a ‘faster’ payments system have also introduced a ‘system’ limit (or a set of system limits) for the use of that system.

Each day in Australia, on average over the last 12 months (according to the RBA), the 'system' processes ~41,500 RTGS payments. If the context of an RTGS payment is 'make a near real-time, non-repudiable and settled transaction to a beneficiary', and the context of an NPP payment is more or less the same – what would be the continued need for an RTGS payment?

Perhaps none. Casualty number 1 – the RTGS system. Hold on a minute … don't banks charge for RTGS? A search of the big 4 banks shows that the average price a retail client pays for a domestic RTGS transaction is around $35 (see example here on page 25). Now, assuming the banks don't charge their business clients the same fee as their consumer clients, let's apply a generous discount of 50%. That still means that, combined, and conservatively, the banks stand to lose around $190 million per annum in RTGS fee income if that channel is cannibalised.
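
For those who like to check the maths, the $190 million figure reproduces from the numbers above – a rough, assumption-laden estimate, of course:

```python
# Back-of-envelope reproduction of the ~$190M per annum estimate.
rtgs_per_day = 41_500          # average daily RTGS payments (RBA)
retail_fee = 35.0              # typical big-4 retail RTGS fee (AUD)
business_discount = 0.50       # generous 50% discount assumed
business_days = 260            # approx. business days per year

annual_fee_income = (rtgs_per_day * retail_fee
                     * business_discount * business_days)
print(f"~${annual_fee_income / 1e6:.0f}M per annum")   # ~$189M
```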

I think that if the system doesn't impose a limit, the commerciality of the banks probably will. $190M is a lot of dough. RTGS will be hit, and perhaps hit hard. You can argue that the 'context' of the payment will determine the clearing stream – perhaps – but there are lots of variables here: intra-day limits, intra-day liquidity requirements, outside-normal-trading-hours liquidity, RBA Exchange Settlement Account balances, etc. Time will tell, but the RTGS stream is almost guaranteed to be hit.

What will a payment cost?

Interestingly, this has relevance to the last question. My answer there was based not on cost but on context – an NPP payment has the same 'context', or put another way the same 'characteristics', as an RTGS payment. So why wouldn't you charge the same amount for it as you do for an RTGS payment?

The answer I feel is in the target market.

NPP plainly has a market in consumer payments (Person to Person (P2P) payments are ideal for this new mechanism), but the market becomes a little cloudier as you make your way up the value chain into institutional-type payments. Payroll and large batch runs of direct debits and creditor payments, for example, will more than likely continue in the short to medium term to be effected via Direct Entry/CS2, as it's the most efficient low-cost, high-volume system we have. These payments are also (traditionally) due during the working week, Monday to Friday. Use cases for institutional-type clients are less clear.

If we concentrate on P2P payments, these are more than likely made via consumer online banking – either from a desktop or from a mobile device. How much do you pay for a payment on that channel today? Probably zero, nada, squat. Is the convenience of a 10-second payment, 24×7, going to spur you to hand over more cash to your bank for making the payment? Probably not. People are lazy, and most don't even understand or care how a payment is made. What I can tell you is that people start to care when they have to pay for stuff, and if an online payment costs ZERO today compared to SOMETHING tomorrow, most people will start rubbishing the banks. The banks might say that you are getting a better service with NPP than before. Some might buy that, but most won't care, and therefore won't pay, and will then shop around.

I'd expect most banks to replace their online banking clearing systems with NPP. Casualty number 2 – Direct Entry. It won't be a big casualty though. Not initially at least, but in time. First, however, let's see what the banks price NPP at, as that will be a big determinant. And maybe, just maybe, banks could charge for NPP if they differentiate their payments into 'immediate' (NPP) and 'delayed' (Direct Entry). Perhaps people will be prepared to pay for an enhanced service compared to the old way. Everyone nowadays wants stuff 'now'. A friend recently reminded me how people get annoyed these days when you don't immediately respond to iMessages, because the sender can see that you've 'read' the message. We are evolving into a 'now' community – and this has relevance to NPP. People will want to see their money 'now' too.

I’d expect though that P2P payments for consumers will be fee free. The first bank to do this will set the market, and others will have to follow.

Will you be able to take part?

That's interesting, and perhaps depends on the 'what will it cost' question. If there's a charge, what happens if you've paid for the payment to be made but the receiver hasn't paid their bank to make the service available? Will that bank put the money into the account immediately, or will it defer it?

Banks will want payback on the massive investments made in a new clearing system – unless the 'system' determines that the cost is zero. And if there's no charge, the point is moot, as differentiation (as a fee for service) disappears.

Fraud

Another factor needs to be taken into account for NPP – fraud. In a system that offers immediate, non-repudiable payments from bank to bank, you can bet that the fraudsters will be out there trying to hack their way in. I can see them now, just waiting at ATMs for the money to arrive. Banks will need to invest in state-of-the-art fraud prevention systems to protect themselves and their customers. It's a big deal. BIG.

Data Services

Having said all of this, let's return to something I mentioned at the start of this article – data. NPP is based on ISO 20022. Lots of data can flow with the payment – almost unlimited data, and payment 'attachments' can be exchanged. This might not mean much to a consumer (after all, you probably know what the money that ended up in your account was for), but it means a lot to businesses. They rely on this data to reconcile who paid them and what it was for. Casualty number 3 – BPAY. Previously you never had a comprehensive way to identify your payment inside a payment. BPAY in Australia solved that to some degree by going outside the normal payments system and introducing biller reference numbers based on a check-digit routine, so you couldn't stuff the payment up. If the payment reference didn't validate up front, you couldn't make the payment.
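
The principle is simple: validate the reference before the payment leaves. BPAY billers use their own proprietary check-digit routines, so purely as an illustration here's the generic mod-10 (Luhn) algorithm that many card and reference numbers use:

```python
# Generic mod-10 (Luhn) check-digit validation - illustrative only.
# BPAY billers use their own check-digit routines; this just shows the
# principle of rejecting a mistyped reference before payment is made.
def luhn_valid(reference: str) -> bool:
    digits = [int(c) for c in reference if c.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))   # True  - well-known Luhn test number
print(luhn_valid("79927398710"))   # False - one digit wrong, rejected
```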

With NPP, though, you can send a heap of data with your payment – you could even send your picture. I wouldn't expect BPAY to lose much ground; it's still very efficient and well understood. However, those utility-type companies (telcos, councils and the like) that are innovative and develop nifty online 'overlay' services to go with an NPP payment could offer 'immediate' reconciliation accompanying the payment. They wouldn't need BPAY anymore. That's just one example.

Summary

As we move forward into NPP, I think there will naturally be payments stream casualties – some more affected than others. It's not all bad news for clearing streams though, because a big opportunity opens up via the data and the overlay services that go along with NPP payments. We never had a system before with the potential to wrap so much data inside the payment mechanism. NPP does this well. For consumers the data may be irrelevant and the payment most important; for businesses the data is most important and the payment may become secondary.

Don't get me wrong, it's all based on the payment – we all need the money, and we want it 'now' – but data and online 'overlay' services have massive potential. The banks are having their 'cheese' moved, and they'll need to work on fresh business models to stay relevant in an increasingly customer-centric world and make up for lost fees in other ways.

Same + Different = Different

I wrote a few weeks ago about 'disruption' – insofar as the cloud's impact on financial services, and whether or not just having the cloud is disruptive.

However, it seems that this word 'disruptive' is now everywhere. I have read no less than half a dozen articles on it this week alone. For examples, see:

Disruptive Innovation in banking and 3 things that banks should do to protect themselves

The Disruption Machine

Australian Startups eyeing big bank insane profits

It's sort of like when you learn a new word and then hear it all the time. So this week I promise not to use that word any more in my blog, and instead concentrate on something that I'm very proud of – two host-to-host solutions that my team and I built at two major Australian banks.

I learned recently that this year's Peter Lee and Associates survey of transactional banking in Australia/New Zealand listed those two host-to-host banking products as having the largest number of respondents/deepest penetration in this year's survey. 92 large corporations responded to the questions about host-to-host; 55 of them (60%) reported using the products that my team designed and built. Of those 55 corporates, 29% reported an excellent experience, 66% an above-average experience, and only one client a below-average experience ('rats' on that point).

Why is this important?

Firstly, the Peter Lee survey is a measure used by all major banks in AU/NZ to evaluate their industry performance against each other in relation to institutional banking. Factors such as 'Lead Bank', 'Relationship Strength' and the like are highly sought-after prizes. Peter Lee results are also used in marketing collateral to provide a third-party view, rather than just a biased one, when demonstrating one's credentials in RFPs and the like.

Secondly, in less than 10 years, pitted against the larger budgets of some international banks, large technology vendors and the like, our team has now created, at two major banks, arguably the best and most used host-to-host platforms in Australia and New Zealand today. We did this with what I would call meagre (but not insubstantial) budgets, and naturally a lot of hard work and dedication.

The team's collective innovativeness and experience has paid off twice now. At the start we backed ourselves by taking a stand on what clients really wanted from a host-to-host platform, and we never sought to just 'buy' it from a single vendor. In fact, we knew we couldn't just buy what we wanted – it needed to be created from best-in-class component products 'wrapped' together by a custom layer. The first iteration started the journey, and we learned a lot. That product was even mentioned in that bank's annual report in 2010 (page 29, look for 'WIBS').

Since that time we have continued the journey, building on our original foundation of ideas (again using best-in-class products wrapped with our custom, but new and improved, layer), rebuilding a new host-to-host product based on generic standards but stretching the innovation further (and of course customising to suit the target bank's environmental, governance and risk standards). At Bank #2 we repaid the original investment in just under 18 months from go-live – relatively unheard of in recent times for product/channel innovation in a big bank.

I had a mantra from day 1 at Bank #2 to drive people to do things differently. If you want the same outcomes, use the same ingredients and the same recipe. If you want different outcomes, you need to change the formula. Same + Same = Same. Same + Different = Different. The approach takes a while though – old habits die hard – but slowly you change things until they become the new normal. We wanted a different outcome, and the formula works!

Where did the ‘magic’ come from though? Where does it sit now?

It's in the people. Never underestimate the experience, value, drive and imagination of a tight group of people who understand deeply how to make 'stuff' happen – and who aren't just technocrats; they are both bankers and technologists.

To those who are or were part of the journey so far – thank you. So much has been done – we have proven ourselves, not once but twice now. Be proud as I am of what we as a collective have done – and look forward to the future.

Until next time,

Leigh

Do you really need BCP for Customer Payments?

Do you think that a Business Continuity Process (BCP) is overly important? Sure, it was something you needed, but it was always the lowest common denominator. If you had robust and resilient technology infrastructure, was it really necessary? Did you really need it? After all, if you had done your technology job right:

  1. Your highly available architecture would compensate for individual component failure;
  2. If that failed, you could fail over the affected component to the Disaster Recovery (DR) site;
  3. If that failed, you could fail over the entire site to DR.

What could go wrong with that? How many failsafes do you need?

Have you attended meetings where, when asked about DR and BCP, you said that if the three items outlined above failed then you probably had other things to worry about (cue 'alien invasion' or 'earthquake' or the like)?

How arrogant! (but we all are overconfident at times).

You probably made a number of assumptions, such as:

  • DR had actually been tested (and passed);
  • Recovery Time Objectives (RTOs) were reasonable, acceptable and achievable;
  • Run books and plans were up to date;
  • Staff had the appropriate training or knowledge (and were available) and knew the passwords;
  • Vendor SLAs and contracts were in place (where required) and up to date;
  • You had enough time to cut over (especially relevant for payments when system 'cut-offs' are near).

These practices require diligence if they are to be depended upon; otherwise you just end up with the technology stack. And that's where you can go wrong.

I think there is a general tendency to over-rely on the technologies that are put in place – on the stack – and the stack doesn't work all that well when the assumptions above turn out to be incorrect. You discover that you can't actually get the systems back up and running within the RTO, the firewall rules between production and DR are different, the run books haven't been updated since that new database procedure you implemented two months ago, and where the bloody hell did the sysadmin put the recovery passwords?

So what happens when all that good stuff fails and you're running payment systems? What happens to the property settlements, creditor payments, payroll, etc.?

That will never happen I hear you say.

It does and it will.

In the last 5 years I've heard and read stories of DR databases becoming corrupt because production cluster replication screwed up the only good backup available.

Stories of DR attempts failing because no one knew the passwords to the databases in DR.

Stories of microcode firmware updates for SAN controllers going bad mid-upgrade, leaving 'highly available' controllers offline with no way to access production data (cue the 'I hope DR works' line).

Stories of humans pulling both power supplies on a fibre SAN during a DR failover attempt, leaving the RAID configuration corrupt and no data at all. Especially serious when the virtual machines running your app/web/database tiers live on that same disk array.

Each of these 'holy shit' moments would cause a little fella at the back of your brain to pop out and say, "What's your BCP, dude?" Because out the back is an outage. And an outage in a customer payment system is measured in the dollars not processed per minute. In some instances, when you are out, the regulators know about it too (as well as your customers).

Sometimes recovery is okay – if it happens well before currency or regulatory cut-offs you can get stuff back again – but if you are uncertain how long it might take, or if you're close to a cut-off, then you have to fall back … to what?

That's where BCP is actually very important. If you don't have a robust BCP you will lose customers. Full stop. Guaranteed. Because the day will come when you'll need it. Maybe not tomorrow, but one day. A good, solid, robust BCP is a great insurance policy, and it's just good business. It isn't gonna stop the house from burning down, but it will minimise the damage caused.

So what does BCP for a customer payments system look like (typically)?

I could be cynical and suggest that it looks a lot like life before the super-reliance on computing came along. It’s a process, or set of procedures, that was last reviewed 10 years ago and relies on a lot of dudes with PCs running Excel spreadsheets, text editors, and a crap-load of email so you can move payment instructions around. It has little workflow associated with it, relies largely on legacy systems to do the ‘grunt work’, and is very, very manual. Often humans are manually keying instructions into the legacy core processing systems.

Due to the manual nature of it there is absolutely no way you can process 100% of the normal workload. Something has to give. So you prioritise, and you end the day with less than 30% of your normal volume processed, hoping the system will come back up tomorrow (cue the pizzas and the Coca-Cola).

IT DOESN’T HAVE TO BE THAT WAY

I now have experience in designing and building two semi- to fully-automated BCP systems for customer payments processing. The idea is to get as close as possible to 100% of the normal volume through.

You can’t do this with just humans.

The key things that you need are (and I am assuming that your ‘core’ back-end processing systems are running):

  • An independent technology stack outside of your main input channel;
  • A simple, BCP-focussed application that your customers can use to upload payment files or create payments;
  • A feedback mechanism that informs your customers and staff;
  • A simple workflow (linked to the BCP application) that allows operational staff to manage payments;
  • A straight-through processing mechanism to get payments from the BCP application directly into your core processing systems (again, linked to the BCP application).

If you have one of these you are well on your way. If you don’t, then you’d better have a pretty long fire hose and a lot of water.
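
To make that concrete, here’s a minimal sketch (in Python) of what the intake and straight-through processing pieces could look like. The file format, field names and the core-system hook are all assumptions for illustration – your core will have its own interface:

```python
# A minimal sketch of a BCP intake with straight-through processing into core.
# The CSV layout, field names and the core-system hook are assumptions.
import csv
from dataclasses import dataclass
from pathlib import Path

@dataclass
class Payment:
    account: str
    amount_cents: int
    reference: str

def parse_payment_file(path: Path) -> list[Payment]:
    """Parse an uploaded customer file: account,amount_cents,reference per row."""
    with path.open(newline="") as fh:
        return [Payment(row["account"], int(row["amount_cents"]), row["reference"])
                for row in csv.DictReader(fh)]

def submit_to_core(payment: Payment) -> bool:
    """Hypothetical straight-through hook into the core processing system (stubbed)."""
    print(f"SUBMIT {payment.reference}: {payment.amount_cents}c -> {payment.account}")
    return True

def process_upload(path: Path) -> tuple[int, int]:
    """The feedback mechanism: return (accepted, rejected) so staff and customers know where things stand."""
    accepted = rejected = 0
    for p in parse_payment_file(path):
        if p.amount_cents > 0 and submit_to_core(p):
            accepted += 1
        else:
            rejected += 1
    return accepted, rejected
```

The point isn’t the code – it’s that every component in the list above (intake, feedback, workflow, straight-through) is automated, so your people manage exceptions instead of keying payments.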

Good Luck.

Leigh

Disruptive Technology and the Cloud

I did a presentation recently in Sydney. It went pretty well – by that I mean I was happy with what I had to say (and so were others).

The topic was “Innovation in Payments” – nothing new in my presentation apart from some of my thoughts on the Australian New Payments Platform (NPP) and “Least Cost Routing”. Along the way, though, I’ve been asked for my thoughts on disruptive technology.

Somewhere along the road I’ve also been involved in discussions around the cloud and its impact on disruptive technology.

Having said all of this, I still think that the Gartner predictions are correct – INTEGRATION outside the corporate firewall over the next 24-48 months will be key to the success businesses need in the future.

In Financial Services there will, for some time, be a desire to keep things in house. No surprise there. The increasing challenge will be how much of it stays in house and how much can be done elsewhere. What is at the root of this desire, though – is it protection of customer data, or is it something else that needs to stay inside, and how much of it really needs to stay inside? (By the way, let’s stay right away from arguing with regulators – that’s an exercise in kicking yourself in the head; one where you end up with a blood nose and they end up getting new shoes.)

Is it the cloud that is nasty? Couldn’t you just say that the cloud is just another name for a ‘hosted’ service or a ‘managed service’, in much the same way that what we now call ‘Digital’ used to be called ‘online’, and before that ‘eBusiness’, or ‘e’-this-or-that? Is it any different? Really? For example, isn’t SWIFT a ‘cloud’ service? And if it’s not, why not? (Or are you going to say that SWIFT is a ‘hosted/managed service’?) Leave that debate for later.

For the time being, though, let’s just call the opportunity the ‘cloud’. Forgetting about the security concerns (do they really exist anyway?), what will the cloud let us do tomorrow that we can’t do today? What is the advantage that the cloud could provide us with?

Let’s go through a few topics:

Governance – would that be any different? Maybe – you have different endpoints, and your relationship with the infrastructure provider might be different – and that would take oversight. Data realms/sovereignty might be impacted, and so might security of data. So I think governance in the cloud would need to be increased compared to managing your own infrastructure on site. Maybe the controls are different – maybe that’s an opportunity in its own right for a clever security/risk-management provider.

Software SDLC – Dev methods would need to change. Software needs to recognise a different way of residing on infrastructure. Older stuff might be okay ‘living’ on named instances, but one of the benefits of a seemingly never-ending infrastructure capacity is that your software should be able to take advantage of infra when it needs to. Spinning up new web/app/database servers when the software demands it, and then turning them off when not required, is truly ‘on demand’. Software needs to know how to do this and, more importantly, when to do this. Developers need to be building hardware ‘cloud’ abstraction layers inside their apps, or abstraction layers for legacy software, in order to do this. So the Software SDLC is impacted too. Unless you don’t care – in which case don’t do a thing: react to failures in the traditional way and waste your money on traditional incident management. Get it right, though, and you could argue that you don’t need incident management, because the incidents are ‘programmed’ in and dealt with automatically.
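
To illustrate what I mean by an abstraction layer, here’s a rough sketch. The interface and the scaling rule are purely hypothetical – the point is that the application codes against a provider-neutral interface, and the ‘when’ lives behind it:

```python
# A sketch of a 'cloud' abstraction layer inside an application.
# CloudProvider and the scaling rule are hypothetical - illustration only.
import math
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Provider-neutral interface the application codes against."""
    @abstractmethod
    def start_instance(self, role: str) -> str: ...
    @abstractmethod
    def stop_instance(self, instance_id: str) -> None: ...

class AppServerPool:
    """The software decides *when* to scale; the provider decides *how*."""
    def __init__(self, provider: CloudProvider):
        self.provider = provider
        self.instances: list[str] = []

    def scale_to_load(self, requests_per_sec: float, capacity_per_instance: float = 100.0) -> None:
        """Spin servers up when demand requires them, and turn them off when it doesn't."""
        needed = max(1, math.ceil(requests_per_sec / capacity_per_instance))
        while len(self.instances) < needed:
            self.instances.append(self.provider.start_instance("web"))
        while len(self.instances) > needed:
            self.provider.stop_instance(self.instances.pop())
```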

Infrastructure Provisioning and Management – Biggest impact. Why would I want to own a server when I can ‘rent’ a virtual one and only pay for the time it’s actually in use? Use of the ‘cloud’ transfers often-huge capex bills every 3-5 years into an opex outlay. That has good and bad aspects to it, though: opex hits your P&L in the year of build; capex doesn’t. In some countries you can capitalise the spend and not have the impact hit your P&L (by way of depreciation) until you commercialise – and that could be years away. On other matters, you can use automation tools to clone/copy/spin up new servers in minutes. You can use APIs from the Software SDLC to control them. You can use the same APIs to configure networks/firewalls and other security devices on the fly. You only pay for what you need. Giving software developers timely access to infra for their stuff to live on is a HUGE precursor to agile innovation. This gets a big tick – but don’t forget the Governance impact above. How do you document ‘in the cloud’ as an ‘as built’ infrastructure – especially when it can change so rapidly and dynamically?
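
As a concrete taste of provisioning through an API, here’s roughly what spinning up and tearing down a server looks like using AWS’s boto3 SDK. The AMI ID, instance type and tags are placeholders – treat this as a sketch, not a recipe:

```python
# Provisioning a server through an API instead of a capex purchase - a sketch.
# Requires AWS credentials; the AMI ID, instance type and tag are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-2")

# Spin up a single on-demand server in minutes...
response = ec2.run_instances(
    ImageId="ami-12345678",        # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "bcp-demo"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]

# ...and turn it off when you no longer need it, so you stop paying.
ec2.terminate_instances(InstanceIds=[instance_id])
```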

Reliability/Redundancy – Another big impact. It’s a bit like ‘use the Force, Luke’. ‘Let go’, ‘trust your instinct’. You have to trust that your provider has this in hand – or build it in another cloud somewhere else. I think the cloud has untapped potential for increasing uptime and reliability. Netflix used this to their advantage in deploying a Simian Army (http://techblog.netflix.com/2011/07/netflix-simian-army.html) that randomly tests their environment for redundancy. Have a read – it’s a great concept – they build failure into their daily life. Having to get up at 3am to fix servers and stuff has gone away. Organisations that build this concept into their infrastructure management plans just ‘deal’ with incidents as they happen, without having to worry about key services being down. Everyone should latch onto this initiative. Traditional DR will continue to serve a purpose, but get the ‘link’ between hardware and software right, in a way that uses the resources properly, and your DR will turn into a 1-hour verification rather than a run book that humans can stuff up.
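
The core of the Simian Army idea fits in a few lines. This is a toy sketch – the inventory and terminate calls are hypothetical stand-ins for real provider APIs – but it shows the principle: randomly kill something during business hours and prove the platform recovers:

```python
# A toy 'chaos monkey' in the Simian Army spirit - illustrative only.
# list_instances() and terminate() are hypothetical stand-ins for real provider calls.
import random
from datetime import datetime

def list_instances() -> list[str]:
    return ["web-1", "web-2", "app-1"]   # stand-in for a real inventory call

def terminate(instance_id: str) -> None:
    print(f"Terminating {instance_id} - the platform should recover on its own")

def unleash_monkey(kill_probability: float = 0.1) -> None:
    """Only during business hours, when engineers are around to watch the recovery."""
    now = datetime.now()
    if now.weekday() < 5 and 9 <= now.hour < 17 and random.random() < kill_probability:
        terminate(random.choice(list_instances()))
```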

Security – The area with the biggest question marks. How can you transfer all of those security appliances and protections to the cloud in the same way you can with physical infrastructure? I need to do more research on this, and it is perhaps the one area that lends itself to the most innovation from cloud/security-appliance providers.

Cost – All of the above. You will increase costs in some areas, and decrease them in others. But the decreases should far outweigh the increases. Jump on it, I say.

Now, thinking about all that, let’s answer the question I posed:

What will the cloud let us do tomorrow that we can’t do today? What is the advantage that the cloud could provide us with?

In short:

  1. We can do everything we do today (or should be able to).
  2. We should be able to lower the cost of infrastructure, and shift its financial impact from capex (balance sheet) to opex (P&L).
  3. We should be able to build in reliability and redundancy, making applications look like they can ‘self-heal’ if part of the infrastructure fails.
  4. There are still open questions about security.
  5. We should be able to develop and deploy in a way that fits a modern AGILE environment.
  6. We should be able to do all of this, ideally, at a lower cost.

But wait on, I’m still missing something.

Didn’t I talk about the opportunity to do something disruptive?

The cloud itself won’t manufacture disruptive technology – you still need the ‘great’ ideas for that. The cloud, however, should provide a ‘platform’ for disruptive technology to live on. A launching pad. You still need the great software and the great products – but in the future they will ‘live’ elsewhere.

Enough for now.

Until Later

Leigh