Tag Archives: data driven performance

How to Modernize Government Using Open Data Sources

In a previous blog post about modernizing government, I talked about why open data matters and how it can be a tool of democracy. In today’s post I want to focus on open source and address some of the opposition to open source development models. Open source as a development model, together with open data, is important for local government 3-1-1s because it broadens access to municipal information, reveals trends in the community, and supports accountability.

It’s not unusual for people to confuse these two, so let’s start with some definitions courtesy of (fan favorite) Wikipedia:

Open Data: “Open data is the idea that certain data should be freely available to everyone to use and republish as they wish, without restrictions from copyright, patents or other mechanisms of control.”

Open Source: “In production and development, open source as a development model promotes a universal access via a free license to a product’s design or blueprint, and universal redistribution of that design or blueprint, including subsequent improvements to it by anyone.”

Like providing open data, using an open source digital strategy supports a transparent culture–especially for 3-1-1 systems–while also allowing agencies to receive the benefits of an open source process. GovLoop, in a report that highlights government trends, outlines the importance of open source in government nicely, saying, “Open source development accelerates government’s digital transformation by allowing agencies to reap the benefits of others’ progress. Secondly, it creates a transparent process that can foster public faith in these new initiatives…an open source approach ensures that digital initiatives will be maximally effective because it provides channels for users to report bugs and provide suggestions for improvement.” In summary, open source models give both internal and external customers the ability to provide real-time feedback, which is valuable to all parties. Within a 3-1-1 environment, this looks like being able to see when a service request has been received by a department, or having real-time dashboards that show what types of requests are being taken.
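To make the dashboard idea concrete, here is a minimal sketch that tallies service requests by type, the raw material for a real-time view of what citizens are asking for. The CSV extract, column names, and request types below are invented for illustration; a real 3-1-1 feed from a city’s open data portal would expose similar fields.

```python
from collections import Counter
import csv
import io

# Hypothetical extract of a 3-1-1 open data feed (invented rows;
# real municipal feeds expose similar columns).
SAMPLE = """request_id,service_name,status
1001,Pothole Repair,Open
1002,Graffiti Removal,Closed
1003,Pothole Repair,Open
1004,Street Light Outage,Closed
"""

def request_counts(csv_text):
    """Tally service requests by type for a simple dashboard view."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["service_name"] for row in reader)

counts = request_counts(SAMPLE)
print(counts.most_common(1))  # [('Pothole Repair', 2)]
```

The same tally, grouped by week or by neighborhood, is what a real-time requests dashboard would render.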

Like any model, open source has its critics. The primary criticism, however, is more conceptual than anything else, resting both on theoretical incongruence when the model is applied to government and on cultural opposition. Ephemeral Journal published a compelling article by Nathaniel Tkacz on this very subject: “From Open Source to Open Government: A critique of open politics.” Tkacz points out that the idea of openness within a political sphere is rarely examined semantically and that, in practice, political openness establishes a sensibility amongst citizens without defining limitations.

You can see how this could potentially be problematic for local government, but let’s not disregard our own democratic structure. If we view government as an entity that drives social change through democracy, then we must view the “(re)emergence of ‘the open,’” as Tkacz calls it, as a reflection of the government’s socio-transformative nature. Modernizing government also requires adapting to modern ideas. Promoting universal access is necessary because democracy requires informed citizens. The goal of any 3-1-1 is to serve customers and to provide them with tools that empower them. Open data is a tool that empowers citizens. In this way, an open source approach is both necessary and important for 3-1-1, and should be a priority for all branches of government.

Return on Investment (ROI) Model in Government – Does It Really Exist? Maybe…

The question of how government can track the success of profitless projects comes up regularly. It is easy to follow a dollar: money leaves tracks. But how does local government leverage private-sector metrics to better inform future projects and practices?

Non-profits use a different measure of value, reflecting a more impact-centric formula for measuring ROI. Monetizable outcomes and value have taken command of the popular imagination, yet motivation, beliefs, and ethical practice are equally important, and they have long defined value in the public sector. Regardless, the bottom line is that investment creates more investment.

According to a 2008 report from the ROI Institute, drawing on comprehensive measurement and evaluation process data from over 200 organizations, “Global trends in measurement and evaluation” indicate that “increased focus is driven by clients and sponsors” and that “ROI is the fastest growing metric.” These two factors demonstrate that an organization’s increased focus is directly impacted by the return. Impact can easily be interchanged with the public sector’s definition of value.

The relationship between return and exterior financial support points to an across-the-board paradigm shift in all sectors. Activity is no longer sufficient evidence to justify itself. Activity–whether a program, a project, an initiative, or the creation of a product–must be results-based. Hence the need to abandon ambiguous performance measurements, forge more social partnerships, and use efficient CRM systems that capture data. With this paradigm shift, we see government adapting to results-based processes.

Dr. Jack Phillips and Patricia Pulliam Phillips note in their review, “Using ROI to Demonstrate HR Value in the Public Sector: A Review of Best Practices,” that ROI methodology is currently being used in the public sector in a multitude of ways by entities such as the U.S. Veterans Administration, the Australian Department of Defence, and U.S. federal government agencies. These entities are using ROI to “demonstrate program success and impact of training on educational programs,” to “measure the impact of a new human resources information system,” and to “measure the cost benefit of a master’s degree program conducted on site by a prestigious government.”

The emphasis on managing data isn’t a sporadic interest in government, or a trend that the public sector is suddenly jumping on. At the federal level, the 2002 President’s Management Agenda (PMA) pinpointed five government-wide goals that have influenced this contemporary line of thinking: strategic management of human capital, competitive sourcing, improved financial performance, expanded e-government, and budget and performance integration. The PMA’s goals indicate a need for a comprehensive formula that combines ROI metrics with analytics supporting social impact, program evaluation, and quantitative data, measuring both monetary and non-monetary returns. Finding this formula would accomplish more than saving a few bucks; it could yield gains in productivity and quality.

In an earlier document from the ROI Institute, Dr. Phillips provides an example of what this would look like:

“In a government setting, cost savings measures are available from every work group. For example, if a government agency implements a program to improve forms processing–a productivity measure is number of forms processed; the quality measure is the error rate on processing forms; a time measure is the time it takes to process the forms; and a cost measure is the cost of processing forms on a per-unit basis. Improvements in work unit performance in a government setting have many opportunities for program benefits that can be converted to monetary value.”
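Phillips’ forms-processing example maps onto the standard ROI formula: net program benefits divided by program costs, expressed as a percentage. The sketch below uses invented numbers purely to show the arithmetic; they are not from the ROI Institute document.

```python
def roi_percent(monetary_benefits, program_costs):
    """ROI (%) = (program benefits - program costs) / program costs * 100."""
    return (monetary_benefits - program_costs) / program_costs * 100

# Illustrative (invented) numbers: a forms-processing improvement
# saves $1.20 per form across 150,000 forms a year, against a
# $120,000 implementation cost.
benefits = 1.20 * 150_000   # $180,000 in converted cost savings
costs = 120_000
print(f"ROI = {roi_percent(benefits, costs):.0f}%")  # ROI = 50%
```

The hard part in a government setting isn’t this division; it’s converting productivity, quality, and time measures into the monetary benefits that feed it.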

One way that third sector organizations (TSOs) in the United Kingdom have attempted to quantify the social value of their sector is by developing a methodology: Social Return on Investment (SROI). The goal of SROI is to translate social, economic, and environmental benefits into monetary value. Yet SROI isn’t necessarily applicable to individual programs and initiatives, and it still prioritizes financial measurements over what a social audit would produce: qualitative information combined with financial data that informs internal performance.
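As a rough sketch of the SROI idea, the ratio compares the discounted present value of monetized social benefits to the amount invested. The figures and discount rate below are invented for illustration:

```python
def sroi_ratio(annual_social_value, years, discount_rate, investment):
    """SROI = present value of monetized social benefits / investment."""
    present_value = sum(
        annual_social_value / (1 + discount_rate) ** year
        for year in range(1, years + 1)
    )
    return present_value / investment

# Invented example: $50,000/year of monetized social value over
# 3 years, discounted at 3.5%, on a $100,000 investment.
ratio = sroi_ratio(50_000, 3, 0.035, 100_000)
print(f"SROI = {ratio:.2f} : 1")  # SROI = 1.40 : 1
```

The contested step, as the critique above suggests, is the monetization itself: assigning a dollar figure to a social or environmental outcome before the arithmetic even begins.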

Ultimately, even with the strides that TSOs have made, there is still a global gap in knowledge when it comes to gauging impact for smaller-scale profitless efforts. A 2013 working paper from the Tellurid Science Research Center concluded on a similar note, stating:

“There is an extensive body of grey literature on impact measurement practice, however this has tended to be small-scale and boosterist in nature. The field has also suffered from a lack of theorisation of key concepts and critical appraisal of previous research, with a few exceptions. A number of studies are emerging which attempt to address this theoretical and empirical gap, but in general empirical research on impact measurement practice in the UK third sector, particularly which organizations and subsectors are undertaking impact measurement and the practices and tools they are using, is limited.”

Though there are limitations, the potential remains for the public sector to find an all-encompassing return on investment model; however, no formula or practice standard exists at the moment. BUT there is still hope!

How are you measuring the ROI or SROI in the public sector? I would love to hear your feedback and suggestions.

Planning for the Future of Digital Services in Government

I was recently asked in an interview with GovLoop, a government-focused social network and online publication, about how the City of Philadelphia is engaging citizens through digital services. Government is changing, and the conversation is no longer about why we need digital services for engagement initiatives, but how we can use them. The key to engaging citizens through digital services lies in getting to know your audience, having a strategic plan, using a wide range of channels to communicate with your customers, and listening to feedback.

The design of our digital service platform is entirely informed by customers. Both our internal and external customers’ wants and needs determine the service we will provide. Having a clear definition of your stakeholders, and framing your relationship around the question, “How can we make you successful?”, is pivotal.

In government, we have to be cautious about spending; as a result, the voice of the community must define what we prioritize in service. As I mentioned in my interview, “We look at everything in order to define what we want to design…you have to bring the customer’s feedback to the table, not just the internal people. You need everyone’s ideas, but specifically you need to know what your customers want and then design something around meeting their needs.”

Data trends become even more crucial when determining citizen needs. As citizens adapt to mobile lives, we need to meet them where they are. Forty percent of Philadelphians do not have Internet access in their homes; however, most have access to mobile devices. Knowing that we have to meet our customers where they are, social media becomes an influential tool. For example, the City of Philadelphia Philly311’s Twitter and Facebook accounts are able to connect with communities on an inherently social platform. Social media also offers us an opportunity to observe trends: what people are talking about, and what topics generate the most conversation. Being the 5th-largest city in the US means that individual communities have needs specific to their neighborhoods. Monitoring social media is an excellent way to manage the various voices throughout the city.
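A minimal sketch of that kind of trend monitoring: count how often service topics appear in citizen posts. The posts and keywords below are invented; a real pipeline would pull posts from the social platforms’ APIs.

```python
from collections import Counter

# Invented sample posts; a real pipeline would pull these from
# the Twitter/Facebook APIs.
posts = [
    "Another pothole on Broad St, please fix it",
    "Thanks for the quick graffiti cleanup!",
    "Potholes everywhere after the freeze",
    "Street light out on my block again",
]

KEYWORDS = ("pothole", "graffiti", "street light")

def topic_trends(posts, keywords):
    """Count keyword mentions as a rough proxy for what
    neighborhoods are talking about."""
    counts = Counter()
    for post in posts:
        lowered = post.lower()
        for kw in keywords:
            if kw in lowered:
                counts[kw] += 1
    return counts

print(topic_trends(posts, KEYWORDS).most_common())
```

Segmenting the same counts by neighborhood is what would surface the community-specific needs mentioned above.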

In addition to social media, surveys are crucial to getting to know one’s audience. Through surveys, we collect data that speaks specifically to issues. However, noticing trends, leveraging social media, and collecting data mean nothing if that information isn’t put into action. Planning a communication strategy is imperative to creating a framework for the dialogue. Once you know what is working, and have created a blueprint of how you got there, you can apply that template to other initiatives.

Find out more about what’s trending in government digital services here: https://www.govloop.com/resources/future-digital-services-five-trends-transforming-government/

8 Tips to Get Your Team Using CRM in 2015 by Michael Hanna

A CRM implementation is more of a cultural change than a technological change. That’s because adopting a new system requires changing habits, and changing habits is hard. It’s hard for those who want to change, let alone those who do not.

Most people demand change, but resist it when it comes. Resistance to change is natural, so it’s crucial to help CRM users through the process of embracing change. When it comes to CRM adoption, users need your help, they need your reinforcement, and they need that culture of accountability.

Here are eight practical, actionable steps to take before, during, and after the CRM launch.

1. Be Aware of Data Integrity

System-to-system consistency–the integration of multiple systems–is crucial for a strong cadence and user adoption. If you’re migrating from one CRM to another, merging CRMs, or changing CRM providers, ensuring the data is successfully merged and consistent is essential to data integrity. Without that care, the process often results in duplicate data, unstandardized or inconsistent data, and missing data. Preparing for these data mishaps in advance, and having tools in place to clean the data and prevent them from happening, will ensure CRM system integrity.
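A minimal sketch of the standardize-and-merge step described above. The record shape and field names are hypothetical; the point is that normalizing fields before merging on a stable key (here, email) is what keeps duplicates from crossing over.

```python
def normalize(record):
    """Standardize fields so 'JANE DOE' / 'Jane@City.gov' and
    'Jane Doe' / 'jane@city.gov' collapse into one contact."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
    }

def merge_contacts(*sources):
    """Merge contact lists from multiple CRMs, keyed on email;
    duplicates are dropped, and records with no email are skipped
    (a real migration would flag them for review instead)."""
    merged = {}
    for source in sources:
        for record in source:
            if not record.get("email"):
                continue
            clean = normalize(record)
            merged.setdefault(clean["email"], clean)
    return list(merged.values())

old_crm = [{"name": "JANE DOE", "email": "Jane@City.gov"}]
new_crm = [{"name": "Jane Doe", "email": "jane@city.gov"},
           {"name": "Sam Lee", "email": ""}]
print(merge_contacts(old_crm, new_crm))
# [{'name': 'Jane Doe', 'email': 'jane@city.gov'}]
```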

2. Be Clear About the Goal of CRM

Your CRM users are looking for the why behind the CRM system. If you’re not sharing this insight, you’re wasting your CRM investment, because users simply won’t adopt it. Deliver a clear rationale and cause for your CRM. As a sales example, CRM gives users visibility that enables continuous sales improvement.

3. Hold CRM Users Accountable

It’s important to empower your CRM users, and CRM adoption should focus on that. However, empowerment without ownership is going to lead to neglect. You can give your users the most pristine, high-end CRM, but if they don’t care, they’re not going to use it. Establish the CRM users as the owners of the CRM, and then, in the context of ownership, empower them to use it. Otherwise, they’ll be negligent and passive.

4. Manage Detractors

If you’ve got ten sales reps using your new CRM, and eight of them are adopting it beautifully while two are struggling, you must work hard to get those two back on track. Stay strong and don’t lower your standard, or the other eight will start to slack as well. By not diminishing your expectations, by holding users accountable, and by providing help and assistance, you can get all ten sales reps completely on board.

5. Demonstrate Real Results

Look for opportunities to showcase the relationship between CRM adoption and the positive sales performance that results from it. Explicitly call these results out when they happen. Here is one example of such an opportunity:

Sales reps quickly follow up on leads delivered in real time via the CRM, resulting in higher lead conversion. Call it out!

6. Provide Ongoing Support

Be extremely responsive to the sales reps’ questions and challenges, and try to support them in real time. Refer them to your documentation, and add their questions to your feedback list if your documentation hasn’t addressed them yet.

7. Be Mobile

Reps do not want to go back to their desks and spend an hour every day updating a CRM; this creates detractors. Allow users to update the CRM system in real time, including while they’re in transit, when they’re coming out of a meeting, and so on. Mobile CRM enables users to access the system without pulling out a laptop and connecting to WiFi. Mobile deployment is a critical part of CRM user adoption.

8. Keep the User in Mind

Don’t introduce more change than users can absorb; otherwise they won’t be able to adopt it even if they want to. Pushing too hard or too fast will deepen the mindset of existing detractors and create new ones. Your CRM success will only be as strong as the rate at which the system can be adopted, not the rate at which it can be implemented.

The CRM adoption process is a journey, not a destination. When asked if the CRM adoption process is ever done, the answer is simply, “No, it’s not.”

Read more at http://www.business2community.com/customer-experience/8-tips-get-team-using-crm-2015-01135865#roodxkOyQwshISmO.99

Process Trumps Innovation in Business Analytics by Tony Cosentino

I wanted to reblog this post by Tony Cosentino, Ventana Research VP and Research Director, because it is insightful and thought-provoking. In summary, when using or talking about big data, one should think in terms of “What, so what, now what, and then what.”

Read the original post by clicking this link: Process Trumps Innovation in Business Analytics

The idea of not focusing on innovation is heretical in today’s business culture and media. Yet a recent article in The New Yorker suggests that today’s society and organizations focus too much on innovation and technology. The same may be true for technology in business organizations. Our research provides evidence for my claim.

My analysis of our benchmark research into information optimization shows that organizations perform better in the technology and information dimensions than in the people and process dimensions. They face a flood of information that continues to increase in volume and frequency, and they must use technology to manage and analyze it in the hope of improving their decision-making and competitiveness. It is understandable that many see this as foremost an IT issue. But proficiency in the use of technology, and even statistical knowledge, are not the only capabilities needed to optimize an organization’s use of information and analytics. Organizations also need a framework that complements the usual analytical modeling to ensure that analytics are used correctly and deliver the desired results. Without a process for getting to the right question, users can go off in the wrong direction, producing results that cannot solve the problem.

In terms of business analytics strategy, getting to the right question is a matter of defining goals and terms; when this is done properly, the “noise” of differing meanings is reduced and people can work together efficiently. As we all know, many terms, especially new ones, mean different things to different people, and this can be an impediment to teamwork and the achievement of business goals. Our research into big data analytics shows a significant gap in understanding here: fewer than half of organizations have internal agreement on what big data analytics is. This lack of agreement is a barrier to building a strong analytic process. The best practice is to take time to discover what people really want to know; describing something in detail ensures that everyone is on the same page. Strategic listening is a critical skill, and done right it enables analysts to identify, craft and focus the questions that the organization needs answered through the analytic process.

To develop an effective process and create an adaptive mindset, organizations should instill a Bayesian sensibility. Bayesian analysis, also called posterior probability analysis, starts with a prior belief and updates it as new evidence arrives to yield a posterior probability. In a practical sense, it’s about updating a hypothesis when given new information; it’s about taking all available information and finding where it converges. This is a flexible approach in which beliefs are updated as new information is presented; it values both data and intuition. This mindset also instills strategic listening into the team and into the organization.
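A small worked example of this updating, with invented numbers: suppose an analyst’s prior belief is that 10 percent of service lapses stem from a staffing gap, and a complaint spike shows up in 80 percent of staffing-gap weeks but only 20 percent of normal weeks.

```python
def bayes_update(prior, likelihood, evidence_prob):
    """Bayes' rule: posterior = likelihood * prior / P(evidence)."""
    return likelihood * prior / evidence_prob

prior = 0.10                                  # P(staffing gap), the prior belief
p_spike = 0.80 * prior + 0.20 * (1 - prior)   # total probability of seeing a spike
posterior = bayes_update(prior, 0.80, p_spike)
print(f"{posterior:.2f}")  # 0.31 -- belief rises from 0.10 to ~0.31
```

Each new week of evidence feeds the posterior back in as the next prior, which is exactly the iterative, belief-updating habit the paragraph above recommends.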

For business analytics, the more you know about the category you’re dealing with, the easier it is to separate what is valuable information and hypothesis from what is not. Category knowledge allows you to look at the data from a different perspective and add complex existing knowledge. This in and of itself is a Bayesian approach, and it allows the analyst to iteratively take the investigation in the right direction. This is not to say that intuition should be the analytic starting point. Data is the starting point, but a hypothesis is needed to make sense of the data. Physicist Enrico Fermi pointed out that measurement is the reduction of uncertainty. Analysts should start with a hypothesis and try to disprove it rather than to prove it. From there, iteration is needed to come as close to the truth as possible. Starting with a gut feel and trying to prove it is the wrong approach. The results are rarely surprising and the analysis is likely to add nothing new. Let the data guide the analysis rather than allowing predetermined beliefs to guide the analysis. Technological innovations in exploratory analytics and machine learning support this idea and encourage a data-driven approach.

Bayesian analysis has had a great impact not only on statistics and market insights in recent years, but also on how we view important historical events. It is consistent with modern thinking in the fields of technology and machine learning, as well as behavioral economics. For those interested in how the Bayesian philosophy is taking hold in many different disciplines, I recommend a book entitled The Theory That Would Not Die by Sharon Bertsch McGrayne.

A good analytic process, however, needs more than a sensibility for how to derive and think about questions; it needs a tangible method to address the questions and derive business value from the answers. The method I propose can be framed in four steps: what, so what, now what and then what. Moving beyond the “what” (i.e., measurement and data) to the “so what” (i.e., insights) should be a goal of any analysis, yet many organizations are still turning out analysis that does nothing more than state the facts. Maybe 54 percent of people in a study prefer white houses, but why does anyone care? Analysis must move beyond mere findings to answer critical business questions and provide informed insights, implications and ideally full recommendations. That said, if organizations cannot get the instrumentation and the data right, findings and recommendations are subject to scrutiny.

The analytics professional should make sure that the findings, implications and recommendations of the analysis are heard by strategic and operational decision-makers. This is the “now what” step and includes business planning and implementation decisions that are driven by the analytic insights. If those insights do not lead to decision-making or action, the analytic effort has no value. There are a number of things that the analyst can do to make the information heard. A compelling story line that incorporates storytelling techniques, animation and dynamic presentation is a good start. Depending on the size of the initiative, professional videography, implementation of learning systems and change management tools also may be used.

The “then what” represents a closed-loop process in which insights and new data are fed back into the organization’s operational systems. This can take the form of institutional knowledge and learning in the usual human sense, which is imperative in organizations. Our benchmark research into big data and business analytics shows a need for this: skills and training are substantial obstacles to using big data (for 79% of organizations) and analytics (for 77%). This process is similar to machine learning: as new information is brought into the organization, the organization as a whole learns and adapts to current business conditions. This is the goal of the closed-loop analytic process.

Our business technology innovation research finds analytics among the top three priorities in three out of four (74%) organizations; collaboration is a top-three priority in 59 percent. Both analytics and collaboration have a process orientation that uses technology as an enabler of the process. The sooner organizations implement a process framework, the sooner they can achieve success in their analytic efforts. To implement a successful framework such as the one described above, organizations must realize that innovation is not the top priority; rather, they need the ability to use innovation to support an adaptable analytic process. The benefits will be wide-ranging, including better understanding of objectives, more targeted analysis, analytical depth, and analytical initiatives that have a real impact on decision-making.


Tony Cosentino

VP and Research Director