I’ll be speaking at this event next week in Edinburgh about VRM and the Mydex initiative.
Also, moves are afoot to get a Scotland-based ‘chapter’ up and running to do some local pushing forward on VRM initiatives.
At the VRM West Coast workshop, Don Marti led a session on Personal RFPs, which led to the issue being debated further on the mailing list and built out in this post by Alan Mitchell. Here’s my contribution, looking as much from the CRM/ recipient perspective as the VRM one – ultimately I think that until we look at both simultaneously we won’t get much up and running at any kind of scale.
Firstly, I think we need to get our terminology in order; as Craig Burton pointed out…we do not yet have a clear VRM lexicon accepted and understood by all project participants.
Here are a couple of references from Wikipedia that relate to and illustrate the background to the terms Request for Information (RFI) and Request for Proposal (RFP). I think we need to look at both in tandem because they typically interact with each other.
Request for Information – A request for information (RFI) is a standard business process whose purpose is to collect written information about the capabilities of various suppliers. Normally it follows a format that can be used for comparative purposes. An RFI is primarily used to gather information to help make a decision on what steps to take next. RFIs are therefore seldom the final stage and are instead often used in combination with the following: request for proposal (RFP), request for tender (RFT), and request for quotation (RFQ). In addition to gathering basic information, an RFI is often used as a solicitation sent to a broad base of potential suppliers for the purpose of conditioning suppliers’ minds, developing strategy, building a database, and preparing for an RFP, RFT, or RFQ.
Request for Proposal – A request for proposal (referred to as RFP) is an invitation for suppliers, often through a bidding process, to submit a proposal on a specific commodity or service. A bidding process is one of the best methods for leveraging a company’s negotiating ability and purchasing power with suppliers. The RFP process brings structure to the procurement decision and allows the risks and benefits to be identified clearly upfront. The RFP purchase process is lengthier than others, so it is used only where its many advantages outweigh any disadvantages and delays caused. The added benefit of input from a broad spectrum of functional experts ensures that the solution chosen will suit the company’s requirements. Effective RFPs typically reflect the strategy and short/long-term business objectives, providing detailed insight upon which suppliers will be able to offer a matching perspective.
I think the background to these terms is key to how we must think of them in VRM world if we are to understand how best to deploy them. What does that mean in practice?
In addition to these characteristics, it is also worth noting that over time intermediaries have emerged (e.g. TEC) who, amongst other support services, make a whole series of standard RFI and RFP templates available at no or low cost in order to stick themselves into the value chain.
My view of the above is that a) the originators of the terms RFI and RFP now have finely honed processes for dealing with them, they do enable win-wins for buyer and seller, and intermediaries have emerged to deal with some of the hard stuff – like finding common terminology; and b) they are typically not automated processes, and thus not at all like what will actually be required to do the things we have commonly described as Personal RFPs in VRM discussions (e.g. ‘I’m here, and I need a stroller for twins’).
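To make the contrast concrete: the kind of automated personal request envisaged in VRM discussions is closer to a small, structured, permissioned message than to a procurement document. Here is a minimal sketch of what such a message might contain – every field name is a hypothetical illustration of my own, not any agreed VRM schema:

```python
# A hypothetical personal-request message: the individual states a need,
# attaches only the personal data they permit sellers to see, and sets
# an expiry. Field names are illustrative, not any agreed standard.
personal_request = {
    "need": "stroller",
    "attributes": {"seats": 2},      # twins, so a two-seat stroller
    "location": "Edinburgh",         # volunteered, permissioned data
    "respond_by": "2009-07-01",      # the request expires automatically
    "permissions": {
        "share_location": True,      # sellers may use location to respond
        "share_identity": False,     # identity withheld until a deal is near
    },
}

print(personal_request["need"], personal_request["attributes"]["seats"])
```

The point of the sketch is that nothing here needs a procurement department: the request can be generated, routed and expired automatically, which is exactly what the traditional RFI/RFP processes are not built to do.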
So: before we progress, we may wish to change our terminology around the RFI/ RFP issue to more accurately reflect what the individual needs; otherwise we risk being confused with the prior deployments of the terms, which actually have very little in common with what the individual might deploy right now.
Here’s my view of what those needs are:
If we look hard enough we’ll find that there are already architectures out there that do 2, 3 and 4 – and bits of 1 are around that can be picked up and added in, either directly or (more likely) via fourth party services. For example, the architecture below has been doing its stuff on the web since way back in 2000; a proposition called 2busy2surf that was way ahead of its time. Unfortunately that business has now gone, but the architecture and buyer-seller matching engine have been white-labelled into 20 or so propositions since then. It is still churning out stacks of permissioned requests for information and requests for proposals, and returning matched information packages or offers. These come direct from the selling organisation or, more typically, through the affiliate markets (third party services).
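The core of a buyer-seller matching engine of the kind described above can be sketched in a few lines: buyers publish permissioned requests, sellers register offers, and the engine returns the offers that satisfy the request. This is purely an illustrative assumption of my own, not the actual 2busy2surf design – all names and structures are hypothetical:

```python
# Illustrative sketch of a request/offer matching engine. Not the
# actual 2busy2surf architecture; all names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Request:
    buyer_id: str
    category: str                                     # e.g. "stroller"
    attributes: dict = field(default_factory=dict)    # e.g. {"seats": 2}
    shared_fields: set = field(default_factory=set)   # data the buyer permits sellers to see

@dataclass
class Offer:
    seller_id: str
    category: str
    attributes: dict = field(default_factory=dict)
    price: float = 0.0

def match(request, offers):
    """Return offers in the requested category whose attributes
    satisfy every attribute the buyer specified."""
    return [
        o for o in offers
        if o.category == request.category
        and all(o.attributes.get(k) == v for k, v in request.attributes.items())
    ]

req = Request("buyer-1", "stroller", {"seats": 2}, {"postcode"})
offers = [
    Offer("seller-a", "stroller", {"seats": 2}, 299.0),
    Offer("seller-b", "stroller", {"seats": 1}, 149.0),
]
print([o.seller_id for o in match(req, offers)])  # -> ['seller-a']
```

In practice the hard parts sit around this core – shared vocabularies for categories and attributes, permissioning, and routing responses back through third or fourth parties – but the matching itself is straightforward.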
So, to get what we used to call personal RFPs up and running, what we need to do, in my view, is:
That’s going to take a bit of time and effort. It’s on the agenda for the User Driven and Volunteered Personal Information working group at Kantara; this group has now been approved and will be up and running shortly. I’ll post the details on how to join that as soon as I have them.
I’ve had a couple more validations of this theory in the last few weeks, so thought I’d best write it up. My hope is that we can use the upcoming VRM Workshop to get the VRM story refined and presented so that we can reduce the number of meetings required to get to the detail of why an organisation should consider ‘VRM enabling’ itself.
So, here’s my theory:
It takes three fairly in-depth meetings before a smart, typically senior CRM/ Customer Management/ Customer Experience executive in a large customer-facing organisation genuinely ‘gets’ VRM and where we are coming from with the project and mind-set – and thus what’s in it for them.
Here’s how it usually pans out in my experience:
Meeting One: This usually happens on the back of an existing contact who has heard or read some snippet about ‘VRM’, or in one of the more in-depth, small-group presentations that I and others have run in the last 12 months or so (mainly in the UK).
The outcome of this meeting, from the perspective of the CRM/ CM Exec, is usually along the lines of ‘These people are well meaning and obviously committed to their hobby, but a bit mad and naive as to what us big organisations have to deal with; but at least I’ve done my bit for keeping an eye on innovation in my space’. Alternatively, the shorter meetings can be driven by ‘don’t these people realise that we’ve just spent a zillion pounds on our CRM application and need to get that to work, because we’ve told everyone it will’.
Most CRM….meet VRM discussions finish at this stage….for now anyway.
Meeting Two: Let’s say that at best one in twenty of the above meetings ends up with a follow-up meeting, and that many of these come through ongoing personal contacts (where CRM/ CM work is going on in parallel), or because sufficient time has passed since meeting one for an update to be of possible value.
This is the meeting during which ‘the penny drops’….but typically only in connection with a very small nugget of opportunity, often one which is front of mind for the exec at that point in time. Examples would include:
– yes, I know our data quality is shockingly bad….., you mean we could work with our customers to fix that…..? Or
– so you mean we could accept these highly qualified leads into our existing CRM system with hardly any tweaks….? Or
– so our customers can help us refine/ define our new products if we engage in the right way?
The outcome of this second meeting is usually….’let me think about that’; and ‘is there anything up and running as a genuine VRM application that I can have a look at?’
Meeting Three: So now we’re down to a very small number of ‘almost converts’. These third meetings are typically much more ‘CRM/ CM/ CE Exec driven’ and are about:
– where do I see this stuff? (i.e. we are usually showing some of the behind the scenes development projects at this stage)
– how can I access it to play around with it, prototype it and build proofs of concept in my domain?
– can you meet up with our innovation folks to talk about a possible pilot?
Underpinning these third meetings is usually the realisation that what we VRM folks are talking about actually has a very sound economic argument, and also that we are about ‘win win’ rather than consumer activism for the sake of it.
What happens after meeting three? I don’t know, to be honest; we’ve not had any yet that I’d count as such – although there are a couple lined up for June and July. I think for those meetings the challenge falls back onto the VRM community, or those of us building VRM-type solutions – we need to be able to answer the ‘meeting three challenges’ loud and clear.
What does that mean for Project VRM and our workshop this week? I think we need to get better at telling our big and complex story, probably in bite-sized chunks and in accessible ways – a good web site, for example. I think we also need to focus on getting some real, live pilots and proofs of concept out there to be engaged with. Let’s pick up on that on Friday.
Lastly, I’d have to add that the record for ‘getting it’ is actually nothing like my three-meeting theory – it was about twenty minutes, and the only question at the end was ‘where do we sign up?’
Welcome to our new web site and blog, at last moved over from static pages to WordPress.
This is where I’ll be blogging about my ‘day job’, customer information strategy consulting, and also my growing involvement in next generation customer information such as that proposed at Project VRM.
(Cross post from Right Side Up)
Here’s an interesting presentation given to the Direct Marketing Association annual Privacy and Data Protection conference last week by Marc Dautlich of Olswang, the lawyers who are advising the Mydex CIC.
The whole deck is useful to those of us interested in privacy and data protection, but the concluding slide is of wider interest – Marc predicts that Volunteered Personal Information will become the dominant marketing paradigm by 2015. Let’s hope so…..
(Cross post from www.informationmasters.com)
The Information Masters benchmark evolved from the original Information Masters research dating back to 1996, and first written up in this book. Over time, the underlying assessment framework has been enhanced to take into account emerging issues, new best practices, developing technologies and the evolving business landscape.
For example, the issues noted below have all had a sizable impact on the Information Mastery landscape over the period.
The Internet – Still in its infancy at the beginning of our research, the Internet – and the ability to utilize it effectively in managing customers – is now one of the most significant differentiators of organizational performance.
Data volumes – When the research study began, the world’s largest customer databases were measured in the tens of Terabytes. Now we wonder if Teradata will have to rename themselves Petadata……
Moore’s Law – But as data volumes grow, that old faithful law means that data processing remains relatively quick and inexpensive.
Government – Having come late to the party, governments across the globe are investing heavily in information-related initiatives. Whilst the ‘kit’ is much the same as in the private sector, their motivations differ.
9/11 – This single event negatively transformed customer expectations of data privacy, the collection and mining of travel related data being most impacted.
Return on Customer – Whilst one of a number of articulations of the need to focus on customer value, the 2005 Peppers and Rogers book clearly set out the importance of this metric to investors as well as to customer managers. Unfortunately, as our benchmark is beginning to unpick, few can deliver against this metric in practice.
Mass Deployment of CRM Applications – Some have succeeded, many have failed to meet expectations – all have generated significant amounts of data.
The benchmark is based on the seven high-level competency areas identified in the original Information Masters research study. These seven areas divide into a further 44 sub-sections of specific practice areas (e.g. Privacy).
The current benchmark is based on 51 assessed organizations. Organizations from multiple sectors (including Financial Services, Telco, Utility, Publishing and Public Sector) are represented. The benchmark includes organizations from the USA, Canada, Europe and Australasia.
Organizations within the benchmark will always be anonymized, unless they grant express permission to be named.
The benchmark, at its current level of evolution, can be used to:
– Identify specific practices that drive competitive advantage for organizations (versus ‘me too’ activities)
– Draw comparisons between leaders and laggards
As we build up the benchmark over time we will increasingly drill down and draw comparisons on:
– Other facts that may emerge as differentiators (e.g. channel use, age of organization)
Current benchmark scores in total and across the seven competencies are shown in the table below:
From this current benchmark profile we draw some high level conclusions:
1. In overall terms, organizational ability to become an Information Master has progressed little over the ten-year period of our tracking, albeit with much change taking place beneath this overall lack of progress.
2. Most organizations have access to plentiful data across the various types required to drive customer management.
3. The ‘softer’ issues around leadership, organization design and driving a data culture remain the most challenging of the competencies, and remain the primary differentiators.
4. The availability of the number and type of people to support information initiatives has improved (however demand still outstrips supply).
5. Organizations have invested heavily in technology over the period, which is good as an enabler; but in and of itself technology is a relatively small component in the Information Mastery jigsaw.
When we begin to look within the seven competencies we can identify some key trends and factors that drive differentiation in organizational performance.
– Organizations now have access to more data than they have ever had before.
– Data volumes are growing quickly and will continue to do so for the foreseeable future.
– Capturing and using unstructured data is a recent challenge (think web 2.0/ user generated content) that organizations have not as yet solved.
– Data quality remains a major challenge; many organizations fail to even define the problem space far less address it successfully.
– The ever-increasing types and volumes of data gathered bring Privacy concerns to the forefront; this will be a key area of differentiation over the coming decade
– Data security problems/ data breaches are an inevitable factor of the current lack of control over data gathering and use; these compound and fuel the privacy issue already noted above.
– Business leaders have had to manage significant new complexities through the rise of the Internet, both as opportunity and threat.
– Many new business model options can now be considered, along with more intermediation/ dis-intermediation and new partnering options.
– Driving successful ‘ownership’ of the information capability remains the key challenge
– Driving a data culture is easier than it was ten years ago, as new recruits are more likely to have a grounding in information, and industries overall have learned a lot and become more scientific in their approaches to customer management over the period.
– Putting the right measures (as opposed to many non-actionable measures) in place remains a challenge, not helped by increasing organizational complexity.
– Conflicting customer versus product orientations continues to drive an Information Mastery lag in many organizations.
– Organizations continue to struggle to invest in and manage Information Mastery as one infrastructure program, outside of silo-based approaches.
– Outsourcing has become a valid option in specific components of information mastery, although one that is not always successfully deployed.
– Too many information programs continue to be deployed via technology function push rather than business user-pull.
– How one organizes the specialist support functions that underpin Information Mastery is now relatively well understood.
– Scarcity of skilled resource is no longer a critical issue, although demand continues to outstrip supply.
– There is a general trend in place towards pushing information skills out from a core, specialist function to the main business management teams (although a strong, central core remains a critical success factor).
– The tenure and/ or experience of the leader of the information specialists is typically a key differentiator/ enabler of success.
– The best performing organizations have mapped the information requirements of each role and deliver to that spec.
– Business processes are increasingly differentiated by how well they tap into the underlying information assets and infrastructure; specific processes such as dynamic pricing require real time access to multiple types of data.
– Strategic and operational analytics have evolved considerably over the period.
– Many organizations continue to struggle with the bundling of products and services into the ‘solutions’ which customers often demand.
– The optimization of Marketing and Selling activity is the focus of much current effort and expenditure – much tactical success can be seen but strategic improvement remains questionable.
– Technology development continues to outpace organizational ability to successfully deploy it.
– Major advances have been made through the availability of CDI/ MDM (Customer Data Integration/ Master Data Management) technologies; these tool-sets (solutions) recognize the multi-threaded, typically cross-organizational nature of the underlying problems being addressed.
– Advances in Business Intelligence (BI) tools have improved data access across organizations (although if not addressed as a data culture issue this can create more problems than it solves).
– Hosted applications and business models aid/ simplify deployment; but are only truly successful if supported by a well planned information infrastructure recognizing the specific nature of working with a hosted environment.