Business rules approach

I find myself thinking a lot about Business/IT alignment. One thing is clear: Enterprise Architecture (EA) MUST be relevant to Business stakeholders as well as IT. If it serves just one side, EA loses relevance and effectiveness.

To this end … I’ve attached a good paper which talks about Business Capabilities and EA.

This leads me to the idea of “Business Rules” that underpin Business Capabilities.

The Business Rules Group (BRG) is focussed on the business perspective of business rules.

Business Rules ~ from the Business Perspective

From the business perspective,
…a business rule is guidance that there is an obligation concerning conduct, action, practice, or procedure within a particular activity or sphere.

Two important characteristics of a business rule are:

  • There ought to be an explicit motivation for it.
  • It should have an enforcement regime stating what the consequences would be if the rule were broken.

Business Rules ~ from the Information System Perspective

From the information system perspective,
…a business rule is a statement that defines or constrains some aspect of the business.  It is intended to assert business structure, or to control or influence the behavior of the business.
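
A minimal sketch of the information-system view: the rule below encodes an obligation (orders above a credit limit require approval), records its motivation, and defines an enforcement regime, reflecting the two characteristics noted above. The names and thresholds are illustrative assumptions, not taken from the BRG material.

```python
# Hypothetical example of a business rule carrying an explicit motivation
# and an enforcement regime (names and thresholds are illustrative).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Order:
    total: float
    customer_credit_limit: float
    approved_by_manager: bool = False

@dataclass
class RuleViolation:
    rule: str
    consequence: str

def check_credit_limit_rule(order: Order) -> Optional[RuleViolation]:
    """Rule: orders above the customer's credit limit require manager approval.

    Motivation: limit the firm's exposure to bad debt.
    Enforcement regime: violating orders are rejected and escalated.
    """
    if order.total > order.customer_credit_limit and not order.approved_by_manager:
        return RuleViolation(
            rule="credit_limit_approval",
            consequence="order rejected and escalated to credit control",
        )
    return None  # the rule is satisfied

print(check_credit_limit_rule(Order(total=15_000, customer_credit_limit=10_000)))
```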

Interested? Read on …

Business Rules Approach

Dachis Group Collaboratory: Experience TV

Posted: 24 Feb 2011 01:38 PM PST

[This post is a re-posting of a blog from the Dachis Group]

In my last blog post I wrote about TV ads being flat.  I feel that the entire TV experience needs improving and that none of the big players have gotten it right.  In this post I’m going to lay out what is missing and what I think the TV experience should be.

TV should be social and have a lightweight Experience Layer that enables consumers to engage with what they are watching in a semi-passive manner.  The application platforms from Samsung, Sony, et al., as they stand today, don't have what it takes to transform the TV experience.  All of the apps I have seen interrupt the native experience of watching TV.   More importantly, TV manufacturers lack the data needed to create a truly seamless and engaging experience.  To create an experience that is both semi-passive and truly enriches the native TV experience, there needs to be a metadata layer that unlocks the signal and content.  Information such as which channel I am on, what I am watching, who that character is, what that object in the background is, and what is happening at this minute in the program is the missing link.  Access to this type of data lies with the cable and content providers, who have, unfortunately, not yet realized what a powerful experience and business value they can unlock by working together.
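
As a minimal sketch of what one moment of such a metadata layer might look like: the structure and field names below are purely illustrative assumptions, not any provider's actual format.

```python
# Hypothetical metadata payload for one moment of programming; field names
# and structure are illustrative assumptions, not any provider's format.
scene_metadata = {
    "channel": "ABC",
    "program": "The Thomas Crown Affair",
    "offset_seconds": 3605,
    "on_screen_entities": [
        {"type": "character", "name": "Catherine Banning", "actions": ["info"]},
        {"type": "object", "name": "green drink", "actions": ["info", "like"]},
        {"type": "music", "name": "soundtrack", "actions": ["buy"]},
    ],
}

# An Experience Layer app could resolve a remote-control "hover" event
# against this payload to decide which overlay actions to offer.
for entity in scene_metadata["on_screen_entities"]:
    print(entity["name"], "->", ", ".join(entity["actions"]))
```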

The other major piece missing is the social graph.  The TV needs to know who I am and what I care about.  As such, Facebook should be the identity and relationship management platform of the TV. Just as cell phone makers are integrating Facebook deep into their phones, so should the social graph be deeply integrated into the set top box.  Consumers should have a profile with preferences just as they have on Facebook and those preferences should carry over throughout the TV watching experience.

The last thing that needs to change is the remote control.  Where is my Like button? Where are my Info or Buy buttons?  Where are the swipe actions?  This type of remote could replace current remotes or be powered by a stand-alone mobile app.

Now that we have all of the pieces, let's get to the fun part: the experience.  The platform I am envisioning is not a completely open one.  Anyone can build an app/experience; however, the cable providers and, most notably, the content owners will have the final say on what can and cannot be integrated with their programming.  For those familiar with how Facebook's platform works, think of the set top box as the Facebook Platform, the program (show or movie) as a brand-owned page on Facebook, and the networks as the corporation that owns multiple brands. Anyone can build an app on Facebook, but not every app will make it onto an owned media channel: a brand-owned page on Facebook or, in the case of TV, a program.  Below, I highlight three types of experiences that provide tangible value to both consumers and brands:

1. A commercial (Super Bowl ;) )
Imagine the Darth Vader Volkswagen commercial:  what if you could roll over the car and Facebook Like it; what if you could roll over the car and be presented with the option for a 360º view and complete specs, or to set up a test drive at a local dealer; what if you could roll over other items in the commercial and find out more information about them?

2. A movie
Let’s take one of my favorites, The Thomas Crown Affair:  what if you could hover over Catherine Banning’s favorite green drink concoction and find out what it was; what if you could buy the movie soundtrack with a swipe of your remote; or what if you could find out the back-story about the house that they snuck away to in the Caribbean?

3. A sporting event
Think about watching a baseball game on a lazy summer afternoon:  what if you could hover over the player and get up to the minute stats; what if you could, with a click or two of your remote, check to see if tickets are available for the next time your favorite team is in town; what if you could find out more information on the products that your favorite players are using?

Would brands find integration into the above-mentioned experiences more valuable? Would consumers find value in them and the overall TV experience more engaging?  Is there tangible business value to be unlocked here?  I think yes to all.  The difference between what I am describing and what exists today is that the experience is a semi-passive, seamless one that is integrated directly into and overlaid on top of programming.  There is no unwanted intrusion. Consumers only engage with the experience when they want to, in the way that they want to.

The possibilities abound.  In the end I’m not sure how it will all play out.  However, I know that the TV manufacturers are not going to win this one.  The winner in this game is either going to be a new startup, Facebook, the cable giants, or a combination thereof.  Experience TV FTW.

 

User Experience Architecture – Working Through Screens

100 Ideas for Envisioning Powerful, Engaging, and Productive User Experiences in Knowledge Work.

User Experience is becoming more essential as we embrace devices such as the iPad and smartphones alongside the more traditional PC/Mac in the corporate world. It is also necessary as we become more demanding of web-based applications and systems.

For all you designers & architects out there interested in this topic – a good reference which I'd thoroughly recommend to help with usability & web design is the book "Working Through Screens".

This is available here – http://www.flashbulbinteraction.com/WTS.html.

If you’re interested – I have a PDF version of this book which I’d be happy to share. Would appreciate any comments/thoughts.

Enjoy!

Solvency II

REF: http://www.fsa.gov.uk/pages/About/What/International/solvency/index.shtml

Solvency II:

Solvency II is a fundamental review of the capital adequacy regime for the European insurance industry. It aims to establish a revised set of EU-wide capital requirements and risk management standards that will replace the current Solvency requirements.

News:

The European Commission publishes the technical specifications for the fifth quantitative impact study (QIS5); see the FSA’s QIS5 page for further information.

The Insurance Sector Newsletters contain useful information for firms about the FSA’s approach to moving from ICAS to Solvency II.

The FSA publishes Delivering Solvency II – an update that summarises the key policy developments and implementation activities.

The Solvency II Directive is due to be implemented on 1 November 2012. Any changes to the go-live date will be formally communicated by the European Commission, at which point the FSA will consider and communicate the potential impact on planning and preparations for itself and firms.

Application:

The Solvency II Directive will apply to all insurance and reinsurance firms with gross premium income exceeding €5 million or gross technical provisions in excess of €25 million (please see Article 4 of the Directive for full details).

In a nutshell:

  • Solvency II will set out new, strengthened EU-wide requirements on capital adequacy and risk management for insurers with the aim of increasing policyholder protection; and
  • the strengthened regime should reduce the possibility of consumer loss or market disruption in insurance.

Central elements:

Central elements of the Solvency II regime include:

  1. Demonstrating adequate Financial Resources (Pillar 1): applies to all firms and covers key quantitative requirements, including own funds, technical provisions and the calculation of Solvency II capital requirements (the Solvency Capital Requirement, SCR, and the Minimum Capital Requirement, MCR). The SCR is calculated either through an approved full or partial internal model, or through the European standard formula approach (a sketch of the standard formula aggregation follows this list).
  2. Demonstrating an adequate System of Governance (Pillar 2): including effective risk management system and prospective risk identification through the Own Risk and Solvency Assessment (ORSA).
  3. Supervisory Review Process: the overall process conducted by the supervisory authority in reviewing insurance and reinsurance undertakings, ensuring compliance with the Directive requirements and identifying those with financial and/or organisational weaknesses susceptible to producing higher risks to policyholders.
  4. Public Disclosure and Regulatory Reporting Requirements (Pillar 3).
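
As a rough illustration of the standard formula approach mentioned in Pillar 1: the QIS5 technical specifications aggregate module-level capital charges (market, counterparty default, life, non-life, health) using a prescribed correlation matrix. The sketch below shows only the general shape of that aggregation; the actual correlation coefficients, adjustments and module definitions are set out in the technical specifications.

\[
\mathrm{BSCR} = \sqrt{\sum_{i,j} \mathrm{Corr}_{i,j} \cdot \mathrm{SCR}_i \cdot \mathrm{SCR}_j}
\qquad\qquad
\mathrm{SCR} = \mathrm{BSCR} + \mathrm{Adj} + \mathrm{SCR}_{\mathrm{op}}
\]

where i and j range over the risk modules, Adj is the adjustment for the loss-absorbing capacity of technical provisions and deferred taxes, and SCR_op is the operational risk charge.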

Adoption procedure:

Solvency II is being created in accordance with the Lamfalussy four-level process:

  • Level 1: framework principles: this involves developing a European legislative instrument that sets out essential framework principles, including implementing powers for detailed measures at Level 2.
  • Level 2: implementing measures: this involves developing more detailed implementing measures (prepared by the Commission following advice from CEIOPS) that are needed to operationalise the Level 1 framework legislation.
  • Level 3: guidance: CEIOPS works on joint interpretation recommendations, consistent guidelines and common standards. CEIOPS also conducts peer reviews and compares regulatory practice to ensure consistent implementation and application.
  • Level 4: enforcement: more vigorous enforcement action by the Commission is underpinned by enhanced cooperation between member states, regulators and the private sector.

The Level 1 Directive text was adopted by the European Parliament on 22 April 2009 and was endorsed by the Council of Ministers on 5 May 2009, thus concluding the legislative process for adoption. This was a key step in the creation of Solvency II.  The Directive includes a ‘go live’ implementation date of 1 November 2012 for the new requirements, which will replace our current regime.

Delivering Solvency II:

In June 2010 we published Delivering Solvency II giving a summary of the key policy developments and implementation activities.  The first issue includes: Completing the fifth QIS; Deciding to use an internal model; Reporting, disclosure and market discipline (Pillar 3); System of Governance; Getting involved in FSA forums; and Key contacts.

Delivering Solvency II

New Web 2.0-inspired Workforce Tool from TIBCO

Workplace Collaboration Tool

As an ex-TIBCO employee – I’m always interested in their new product launches, which are invariably innovative. Read on.

December 7, 2009 – Today, TIBCO Software Inc. announced their new workplace communication tool called tibbr. Differing from enterprise search tools, tibbr offers a subject-based approach in which relevant information finds the users who need it.

With tibbr, built using TIBCO Silver – an application delivery platform for cloud computing – TIBCO aims to filter out unwanted information by keeping the focus on subjects. A tibbr subject can be a user, an application or a process relevant to a business user. Users get their information by subscribing to the needed subject feed.

“Employees don’t suffer from a lack of information; in fact, too much information is hurting employee productivity,” David Mitchell Smith, vice president and Gartner Fellow at Gartner Research was quoted in the release. According to Smith, workforce collaboration processes hold the most promise for productivity gains.

tibbr integrates with the enterprise system via a standards-based API that embeds enterprise social capabilities into systems or applications, and maps applications to subjects with scalability, security/encryption and high availability.
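
To make the subject-based model concrete, here is a minimal sketch of the idea in Python. This is emphatically not tibbr's actual API; the class and method names are hypothetical, purely to illustrate how information finds subscribers via subjects rather than via search.

```python
# Hypothetical illustration of a subject-based feed (NOT tibbr's real API).
from collections import defaultdict

class SubjectFeed:
    """Users subscribe to subjects; posts to a subject reach all subscribers."""

    def __init__(self):
        self.subscribers = defaultdict(set)   # subject -> set of user ids
        self.inboxes = defaultdict(list)      # user id -> received messages

    def subscribe(self, user: str, subject: str) -> None:
        self.subscribers[subject].add(user)

    def post(self, subject: str, message: str) -> None:
        # Relevant information finds the users who need it:
        for user in self.subscribers[subject]:
            self.inboxes[user].append((subject, message))

feed = SubjectFeed()
feed.subscribe("alice", "order-to-cash process")
feed.post("order-to-cash process", "Invoice batch 42 failed validation")
print(feed.inboxes["alice"])
```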

“By combining the best of social networking with TIBCO’s real-time expertise, business users can leverage tibbr to filter through the noise and receive information only once, making it manageable and meaningful,” said Ram Menon, executive vice president, TIBCO.

TIBCO employees will implement tibbr this month, and general availability is scheduled for early 2010.

Reading List

Read more about enterprise social networks in these recent Information Management articles.

IDC: Social Media in the Workplace

Get Real: Social Networking in the Workplace

How to Measure Enterprise Information Management Progress

Introduction

Increasingly, companies are embarking on a comprehensive enterprise information management program to address the growing demands on data coming from regulators, lawmakers and internal business executives. The goals of these programs include higher data quality, more transparency and control, faster access to information, and better insight into internal operations and customers. Compounding the task is the growing volume of data and the increasing knowledge required to handle the sophisticated data technologies.

While these demands can be handled separately, the best companies are realizing the same customer, product and financial data will likely be involved in multiple projects with conflicting schedules and different priorities. Individual data management programs with unique tools, processes and people are also expensive. Each project can require the involvement of the same key subject matter experts, busy people with specialized knowledge of the data and the processes used to create it.

Hence, the business value of an EIM program is the coordination, prioritization and implementation of a broad set of business and IT initiatives that plan and manage critical data holistically and efficiently across the company. A sound EIM program has the following components:

Data strategy: The company’s vision and goals for the data environment are best represented by a comprehensive data strategy. The strategy includes the technical and business direction for the critical data of the company. Because it is aligned with the company’s business goals, every change to corporate strategy requires that EIM be re-evaluated.

Enterprise governance: Governing data requires data definitions, standards, policies and controls. Included in governance are the various forums for decision-making as well as the responsible roles and the people accountable for the data programs.

Metrics/controls: Agreed-upon goals to be achieved by the EIM program are measured to ensure success.

Data quality: Programs that continuously measure and improve data quality dimensions, such as accuracy, validity, completeness, timeliness and consistency, demonstrate the value of the EIM program.

Skills: Hiring and training skilled information management professionals, in both IT and the business, to carry out the data initiatives is foundational to enterprise governance.

Enterprise data services: Enterprise data services are common tools and methodologies available to business and IT users of data and encapsulate best practices, facilitate reuse and contain costs. Examples of these services include metadata services, search/create/delete processes, ad hoc reports and data mart development. An often-overlooked set of services is the internal communication forums necessary to keep employees informed on the EIM program.

Trusted data sources: High quality, certified, common data sources are to be used across the company, including master data and the enterprise data warehouse.

An EIM program is broad by its very nature. EIM is a collection of multiphase, multiyear initiatives where responsibilities, processes and technology help create change. Core funding and a dedicated team are necessary to implement the components of the program and manage its progress.

An EIM Scorecard

How should progress be measured and communicated to senior management and to those who are funding the data programs? An EIM scorecard is one solution.

Similar to a balanced scorecard with key performance indicators, an EIM scorecard is implemented annually to measure progress against the EIM strategy and its various components. Once an overall EIM scorecard is established, individual scorecards can be developed outlining the contribution of various groups to the year-end metrics. A scorecard can be created for a local department, function or project that ties to the overall year-end enterprise metrics. If your organization has business data stewards, then develop scorecards at a business data steward level. A scorecard is also an effective technique for measuring the business data steward’s effectiveness.

Scorecard Metrics

As in the balanced scorecard KPI project, selecting the right metrics is critical. Determining the appropriate targets requires collaboration across the various owners of the metrics and the EIM program owner. Metrics should be easy to understand and reasonably easy to track. EIM scorecard metrics can be defined in the following categories.

1. Data infrastructure metrics measure the progress toward the technical data strategy vision. Because most companies have duplicate legacy data stores that drive data quality issues, data integration issues and costs, reducing these data stores and using corporate-approved trusted data is an indicator of progress. Additionally, set targets to reduce the total cost of the hardware, software and resources in the company’s data infrastructure to improve utilization and efficiency.

2. Data control metrics define new data standards, policies and processes that are necessary to manage data effectively. When new controls are defined, affected departments must comply in a certain time frame with supporting plans. Data control metrics measure compliance plans as well as any compliance testing results. The completeness of the enterprise metadata repository can also be measured in this category. All important data stores should have an entry in the repository with minimum information established by the EIM governance program.

3. The organization maturity metric measures the progression in training, skill development and role staffing. Any EIM strategy necessitates new skills and responsibilities in data management. Consider using one of the industry information management maturity tools to baseline the current data capabilities of the company and establish an annual improvement plan. The maturity tool can be administered internally or through a consulting company and includes a survey of internal stakeholders.

4. Issue management, such as logging data issues consistently across the company and addressing the high impact issues, is a critical barometer for management. Data issues that arise from internal audits, security testing or operational incidents are of special concern to the company and should be monitored via the scorecard. Logging issues in a consistent fashion also allows visibility into trends and the ability to identify hot spots to be improved on an annual basis.

5. Data quality improvement is a fundamental component of an EIM strategy. During the first year, the company would most likely baseline the dimensions of quality that need improvement. In subsequent years, projects are funded and more dimensions of data quality can be slated for improvement. This metric measures the year-end improvement plans against a set target. If possible, create an aggregate score of data quality for the firm, because too many metrics are often confusing (a sketch of such an aggregate score follows this list).

6. Financial/cost improvements are at the heart of a scorecard. Clearly, no scorecard is complete without a set of financial or cost metrics that validate a solid ROI. Tracking individual project costs as well the overall business case for EIM would be included in this metric.
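
As a minimal sketch of the aggregate data quality score suggested in item 5: a weighted average of dimension-level scores. The dimensions, weights and scores below are illustrative assumptions, not prescribed values.

```python
# Hypothetical aggregate data quality score: a weighted average of
# dimension-level scores (all names and numbers are illustrative).
dimension_scores = {   # each scored 0-100 from profiling results
    "accuracy": 92.0,
    "validity": 88.0,
    "completeness": 75.0,
    "timeliness": 81.0,
    "consistency": 90.0,
}

weights = {            # agreed with the business; must sum to 1.0
    "accuracy": 0.30,
    "validity": 0.15,
    "completeness": 0.25,
    "timeliness": 0.15,
    "consistency": 0.15,
}

aggregate = sum(weights[d] * score for d, score in dimension_scores.items())
print(f"Firm-wide data quality score: {aggregate:.1f} / 100")
```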

Each metric category has an assigned owner. The owner is the person or organization in the best position to affect change. The owner is accountable for establishing the baseline metrics as well as the year-end improvement targets and plans. The EIM program manager drives and owns the scorecard planning and reporting process. Creating quarterly or monthly interim metrics, where possible, provides management with early warning signs if the metrics go off track and provides an opportunity for remediation. It is highly recommended to automate the collection of the metrics, especially if interim metrics are necessary.

Communicating the progress of data initiatives to business leaders and executives is challenging and requires a clear and concise format. The EIM scorecard is emerging as an effective tracking format that provides a consistent mechanism to show yearly progress and communicate results.

This article was originally written by Maria C. Villar.

Biography

Maria C. Villar is managing partner at Business Data Leadership. She is an IT professional with more than 25 years of experience in IT, technology re-engineering and enterprise data management. She has held senior executive positions in both the technology and financial sectors, with responsibilities for data quality, governance, architecture and database technology solutions. She built the first company-wide Enterprise Business Information Center of Excellence at IBM; the COE was recognized externally for best practices in data governance and business intelligence applications. Villar has been recognized by Hispanic Business Magazine as one of the Top 100 Influential Hispanics and received the Distinguished Hispanic IT Executive award from the Hispanic Engineer National Achievement Awards Conference.

The 5 R’s of Data Migration

A migration solution must have the following characteristics:

Robust and resilient

Manage all aspects of the data extraction, transformation, cleansing, validation and loading into the target — and manage high volumes of data, errors in source and target connections, and disk space and memory problems.

Rapid

Execute efficiently and take advantage of existing source or target facilities to enable rapid processing.

Reporting

Provide progress indicators during migration and reconcile the completed process.

Recoverable

Recover from the point of failure when necessary (see the sketch after this list).

Reusable

Reuse components of the migration in other projects, including transformation functions, error handling and data cleansing routines.
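
A minimal sketch of how the "recoverable" and "reporting" characteristics can fit together: a batch loop that checkpoints after each committed batch, so a rerun resumes from the point of failure rather than restarting. All names are illustrative assumptions; real migrations would checkpoint in the target database itself.

```python
# Hypothetical checkpointed migration loop (all names are illustrative).
import json
import os

CHECKPOINT = "migration_checkpoint.json"
BATCH_SIZE = 2

def load_checkpoint() -> int:
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_offset"]
    return 0  # first run: start from the beginning

def save_checkpoint(offset: int) -> None:
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_offset": offset}, f)

source = [{"id": i} for i in range(6)]       # stand-in source table
target = []                                  # stand-in target table

offset = load_checkpoint()                   # recoverable: resume, not restart
while offset < len(source):
    batch = source[offset:offset + BATCH_SIZE]                 # extract
    target.extend({**row, "migrated": True} for row in batch)  # transform + load
    offset += len(batch)
    save_checkpoint(offset)                  # checkpoint only after a good load
    print(f"Migrated {offset}/{len(source)} rows")             # reporting
```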

Tips to Jump-start Your Data Quality Initiative

It’s clear that nobody wants bad data, yet it is a costly reality that is often ignored.

Here are some tips to help you jump-start your own initiatives.

Understand the business context

Successful data quality initiatives are driven by the requirements of business initiatives.
By starting the IT-business conversation, you can confirm the business context and learn which data is needed.

Discover and profile data

Data discovery provides insight into whether you have the data you need, and data profiling examines the structure, relationship and content of existing data sources to create an accurate picture of the state of the data. This helps in planning the best ways to correct or reconcile information assets to answer the business questions at hand. Data discovery and profiling technology can be deployed in-house, or provided by a professional services organization.
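
As a minimal illustration of what profiling examines, the pandas sketch below computes structure and content statistics for a small, hypothetical customer table (the columns and the email check are assumptions for illustration):

```python
# Hypothetical profiling pass over a small customer table using pandas.
import pandas as pd

df = pd.DataFrame({                      # stand-in for a real source extract
    "customer_id": [1, 2, 3, 4],
    "email": ["a@x.com", "bad-email", None, "d@y.org"],
    "country": ["UK", "UK", "US", None],
})

# Structure and content statistics, one row per column:
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "non_null": df.notna().sum(),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile)

# A simple content check: rows whose email fails a basic pattern.
valid = df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
print(f"{(~valid).sum()} rows with missing or invalid email values")
```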

Monitor data quality

You need oversight to ensure that data quality efforts aren’t degraded by the creeping return of errors, such as the introduction of incorrect or nonstandard data. Active data monitoring in profiling reports and scorecards, for example, could generate email or system alerts when certain conditions are met, such as a high percentage of exceptions or nonstandard data. Another, more dynamic, approach entails applying methods used to cleanse and enhance data in real time as it enters and moves through the enterprise.
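
A minimal sketch of the threshold-alert idea described above, assuming a hypothetical send_alert helper and an exception count produced by profiling:

```python
# Hypothetical monitoring check: alert when the exception rate for a
# profiled data source exceeds an agreed threshold.
EXCEPTION_THRESHOLD = 0.05   # 5%, illustrative

def send_alert(message: str) -> None:
    # Stand-in for a real email or system-alert integration.
    print(f"ALERT: {message}")

def check_exception_rate(source: str, exceptions: int, total: int) -> None:
    rate = exceptions / total if total else 0.0
    if rate > EXCEPTION_THRESHOLD:
        send_alert(f"{source}: exception rate {rate:.1%} exceeds "
                   f"{EXCEPTION_THRESHOLD:.0%} threshold")

check_exception_rate("customers.csv", exceptions=480, total=6400)  # 7.5% -> alert
```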

Implement a data quality methodology

The methodology should include the processes and technologies used to create and maintain the quality standards specified by the business rules. It should include discovery/profiling and monitoring as mentioned above, as well as processes for accessing and modifying data from diverse sources; correcting, standardizing and validating data; and enhancing existing data by incorporating external information.