Friday, November 11, 2011

QNXT eXtended Integration (QXI) Services

Available with QNXT 4.81 and later is a new integration option called QNXT eXtended Integration (QXI) Services.  QXI Services, built on the .NET WCF platform, are web services that allow for application integration with QNXT.  These are a big deal.  Until now, the most prevalent integration options were a direct database integration or the QNXT SDK.  The former is a very manual, code-intensive integration, and the latter is not very robust and offers very limited functionality.  With QNXT 4.81, QXI appears to be the preferred choice for integrating QNXT with third-party or proprietary (homegrown) applications.  Out of the box, integration features are available for:
  • Financial A/R Systems
  • Provider Credentialing
  • Underwriting Solution
  • Homegrown Employer/Provider/Member Web Portals (PTS)
  • Web Enrollment for Membership and Employers
  • Scanning/document imaging retrieval
  • PBM
  • Third Party FSA vendors
  • HEDIS Reporting
  • IVR / Customer Service
  • External Case Management Solutions

I am looking forward to kicking the tires on these services and posting a review.

Tuesday, October 18, 2011

The Future of Health Insurance

There are many opinions on this topic; below is an outline of mine.  I'm sure it will continue to evolve.
  • In the next decade, the health care marketplace will be significantly different than it was in the last decade

  • The role of health plans is therefore going to be significantly different

  • Success in this new market is going to take bold leadership. Bold choices will need to be made. New ground broken

  • If we’re able to put (national and local) politics aside, the focus of health care reform will be on:
    • People taking responsibility for their own health
    • Working together to make communities healthier
    • Providing better/smarter choices to consumers enabling them to live a healthier life
    • Sharing the accountability, risk and cost associated with providing care

  • These initiatives are what will reduce cost and improve the quality of care. No one has figured out all the details of how best to do this yet, but one thing is clear:

  • Information (data) is the most valuable asset companies have in this future

  • Health plans will transform themselves from insurance companies into information companies. A central component of their business plan and strategy will be information and information management. The type of information health plans own, and the quality of that information, is what will differentiate them from their competitors.

  • Being able to manage information effectively today is crucial in making this transformation in the future

  • Key (company) performance indicators in this future will be directly related to data and the quality of data. Every person within every department of an organization will be responsible and accountable for meeting performance goals which revolve around data

Friday, February 4, 2011

Lessons Learned Retrospective from our TriZetto QNXT Implementation

This year we are upgrading to the next major version of QNXT. It is another large implementation effort. This week I prepared for a “lessons learned” session with the upgrade team. Here are my impressions of what we did right, and what we aspire to do better.
Commit Resources

  • We allocated dedicated resources to the QNXT implementation program which was important for a program of this size
  • The program manager took a hard line on not letting any resources go – which was great because an effort of this size requires a full commitment
  • What didn’t work: anything less than a 100% allocation to the program. In our environment, everything is an urgent number one priority. So if you were an individual who was 50% allocated to the program, what that really meant was that you were 100% allocated to two number one priority efforts. The QNXT upgrade program is of a size and complexity that requires focus and concentration from all parties involved

Pay For Performance

  • We paid out a performance bonus for key milestones in the program: design freeze, entrance to model office, go live and stabilization. Meeting these milestones was hard. And in the 11th hour we still had work to do to make our targets for every milestone.
  • Everyone’s level of effort was a 10 out of 10. I think the performance bonus worked. It provided an added incentive to get the job done

Get Feedback From the Business Users Early

  • We started the design phase on January 1, 2008. We started testing on July 1, 2008 and ran testing through January 31, 2009. What that means, in the worst case, is that the business defined their need in January of 2008, and we didn’t show them a solution to collect their feedback until 12 months later. And by that time, their needs had changed.
  • The way we need to operate from the beginning is exactly how the QNXT implementation program ended. We should have short cycles of 4 to 6 weeks, where we implement shorter, more focused design, build and testing phases
  • At the start of the program we broke down the work and allocated teams by function: interfaces and extracts, enterprise data warehouse, web, reports migration, configuration. At the end of the program, when we were nearing go live, we reorganized by subject area: eligibility, premium billing, customer service, claims/finance – so we could focus on bringing one subject area live at a time. For the upgrade, we should break this work down by subject area from the beginning and save ourselves the time of reorganizing later. Our functions (interfaces and extracts, configuration, etc.) will still exist, and they will cut across the subject area teams

Sit Together

  • We put our entire web team (the only project track which finished on time) in one room
  • I don’t think we can put a price on the collaboration that will happen naturally when people sit together. We had business analysts working side by side with developers and testers. If we could have had the business subject matter experts there it would have been even better.

Three Rules About Testing

  • Rule #1: Test with real data
  • Rule #2: If you can’t test with real data, see Rule #1
  • Rule #3: Automate your tests

Automated testing will allow us to run more cycles faster and thereby yield a higher quality deliverable in a shorter amount of time

In order to automate testing, we need to get to a more granular level of detail in our business process flow documentation.
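As a sketch of what one of these automated data tests could look like (the extract format and field names below are hypothetical, not from our actual QNXT environment), a small harness can diff a legacy system's extract against the new system's extract row by row:

```python
import csv

def load_extract(path, key_field):
    """Load a delimited extract into a dict keyed by its identifier column."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

def diff_extracts(legacy, candidate):
    """Return (key, description) tuples for every row-level difference
    between a legacy extract and a candidate extract."""
    differences = []
    for key, old_row in legacy.items():
        new_row = candidate.get(key)
        if new_row is None:
            differences.append((key, "missing from candidate"))
            continue
        for field, old_value in old_row.items():
            if new_row.get(field) != old_value:
                differences.append(
                    (key, f"{field}: {old_value!r} != {new_row.get(field)!r}"))
    # Rows the new system produced that the legacy system never did
    for key in candidate.keys() - legacy.keys():
        differences.append((key, "unexpected in candidate"))
    return differences
```

A harness like this can run after every build cycle, turning "test with real data" into a repeatable, automated comparison rather than a manual eyeball check.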


Other Notables

  • Track actual time on deliverables
  • Start knowledge transfer and ownership turnover early
  • Automate everything we can automate

Thursday, November 12, 2009

TriZetto Hosted Batch Architecture

TriZetto Hosting has a framework in place for executing batch jobs – and that framework is called the Hosted Batch Architecture, or HBA for short. So if you are a TriZetto Hosting customer and are writing custom code to extend or enhance the core functionality of QNXT or Facets, you will come to a crossroads where you will need a job created to execute this custom code. And this is where the HBA comes into play: you will have to make sure your job is HBA compatible.

Before I get into what it means to be HBA compatible, I’d like to bring up the benefits the HBA provides. The HBA provides a common look and feel for the execution and supportability of all jobs. This is a very important principle in custom development practice – keeping things consistent. It means that all jobs will be kicked off the same way, they will use the same resources (like I/O and database connections), and in the event that an error occurs, it will be handled in the same manner. This is a very powerful benefit for TriZetto customers, in my opinion, because it reduces the burden and the cost of troubleshooting problems when they arise.

The HBA in and of itself is a “black box,” so I cannot really tell you exactly what the HBA is. The TriZetto documentation refers to it as a framework, so that is what I call it here. From a customer’s point of view, I can share with you that jobs can be configured using XML, and that you can use VB Script to extend the capabilities of the HBA if the need arises.


Creating jobs that are HBA compatible

Note: this section is entirely QNXT specific, and speaks to version 4.51 of the HBA.

In a previous post, I discussed customizing QNXT using the Custom Database. Once you have a Custom Database set up, you will need custom code to take various read and write actions against that database. A common approach is to write this custom code in SQL Server Integration Services (SSIS) packages. SSIS allows you to extract, transform and load data in and out of SQL Server.

Within an SSIS Package you define Connection Managers for any type of resource that will be consumed or published by the SSIS Package. For example, you will need Connection Managers for database connections, file system locations (like an /input or /output folder for example), SMTP and any other resource that will be part of the workflow of the package. A common practice in SSIS development is to place the values of these resources into a configuration file called .dtsConfig, and bind these configurations in at runtime. This way, if you have varying configurations depending on which environment (development, test or production) the package runs in, you don’t have to change code in the package to deploy to these environments.

In order to use a configuration file, you have to define Package Variables within your SSIS package, and use these Package Variables when creating Connection Managers so that resource names are never hard coded into the package. At run time, when the configuration file is bound in, the Package Variables are populated with the appropriate values for that environment, and the Connection Managers are ready to be consumed from and published to.
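For illustration, a .dtsConfig file maps environment-specific values onto Package Variables like the sketch below; the variable names and connection string are made up for this example:

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Binds an environment-specific connection string onto a Package Variable -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::CustomDbConnectionString].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>Data Source=TESTSQL01;Initial Catalog=CustomDB;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
  <!-- Binds the input folder location for this environment -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::InputFolder].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>\\fileserver\qnxt\input</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

Deploying the same package to development, test or production then only requires swapping this file, not editing the package.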

With the HBA, you employ a similar practice. The key differences are that:

(a) The configuration file is not a .dtsConfig. Instead, it is a proprietary TriZetto file called Common System Properties. And if any local configurations are needed, there is a separate configuration file called the HBA Wrapper, also referred to as the job XML

(b) The Package Variables must be placed within an HBA namespace

TriZetto provides an HBA Development Guide which details all the particulars of how to build HBA compliant SSIS packages, including, among other things, the Package Variables you must use.  So becoming HBA compatible means employing these practices in your custom code.

Recommended Reading

If you are a TriZetto customer and are looking to familiarize yourself with the HBA, in addition to reading the HBA Development Guide, I recommend you also request information about:

  • Common System Properties - this will come in the form of a list of variables and their meanings
  • HBA Wrapper Dev Guide - this has a list of commands you can execute inside of the HBA.  For example, there are commands for file manipulation like fileDelete() and fileCompress().
  • Client Specific - you can enhance the functionality of the HBA with VB Script by creating a Client Specific VB file that is imported into the HBA
  • Auto Date Parameters - there are a series of date variables that are predefined and available to you in the HBA.  For example, there is a date variable that always returns the first day of the current quarter, or the last day of the month
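To illustrate the kind of derived dates these auto parameters represent (this is just the date arithmetic; the actual HBA parameter names differ), a first-day-of-quarter or last-day-of-month value can be computed like so:

```python
import calendar
from datetime import date

def first_day_of_quarter(d):
    """First calendar day of the quarter containing d."""
    quarter_start_month = 3 * ((d.month - 1) // 3) + 1
    return date(d.year, quarter_start_month, 1)

def last_day_of_month(d):
    """Last calendar day of the month containing d (handles leap years)."""
    return date(d.year, d.month, calendar.monthrange(d.year, d.month)[1])
```

Having these values predefined means a job never has to reimplement this arithmetic in its own extract logic.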
Stay Tuned ...

In a future post, I will discuss a unique challenge that you have to solve with respect to the HBA.  Since the HBA is only available in your hosted environments, you need a mechanism to unit test your code locally to make sure that you are consistently creating HBA compatible custom code.

Wednesday, November 11, 2009

Creating an Authorization or Referral with QNXT 3.4 SDK

When you install a QNXT EU (Execution Unit), you also get the QNXT SDK. The QNXT SDK is really just a Microsoft Word document that you will find in the \Program Files directory after the install is finished. You do not have to select anything special as part of the install process, the SDK is installed by default.

This Word document lists a series of assemblies that can be used to harness some of the functionality of the Execution Unit. This document does not go into great detail as to how to use these assemblies. Also, no code samples are provided. So you are left more or less to your own devices to figure out how to get what you need out of the SDK.

This blog post describes how we created authorizations/referrals using the QNXT 3.4 SDK. As near as I can tell, what I am describing here is not documented anywhere else, including the TriZetto Customer Exchange website. This is unsupported custom code, but it has passed TriZetto code review, which may not mean much to you unless you are a TriZetto Hosting customer. If you are a Hosting customer, then passing code review means that TriZetto will deploy this code to your production environment.

There are two prerequisites to get the sample code provided in this blog post to run:

  1. Install the QNXT 3.4 EU locally
  2. Create a QNXT user with (a) the appropriate QEnvironment added (i.e., Integrated Dev) and (b) the “Authorization Assignment” role added/granted

Next, you will need to bind in the following reference assemblies to compile this code:

  • QCSI Globals.dll
  • QCSI Internal Proxy.dll
  • QCSI Messages Authorization.dll
  • QCSI Messages CommonTypes.dll
  • QCSI Proxy Authorization.dll
  • QFrame Common Messages.dll

All of the above are installed into the GAC on the EU. If you are using Visual Studio to build your project, you will need to xcopy these out of the GAC first so that they will show up in your Add References wizard.

Without any further ado - here is a working unit test to create an authorization/referral using the SDK:
using System;
using Q.QFrame.Messages;
using Q.Proxy;
using Q.Global;
using MbUnit.Framework;

namespace AuthPosting.Tests
{
    [TestFixture]
    public class AuthPostingTests
    {
        [Test]
        public void CreateReferralTest()
        {
            QNXTProxy _proxy = new QNXTProxy();
            if (!_proxy.InitQNXT("user", "password", "environment", "plandataAlias")) Assert.Fail();

            AuthCreateUpdateRequestMessageType _request = new AuthCreateUpdateRequestMessageType();
            _request.Referral = new ReferralMessageType();
            _request.Referral.EnrollId = "";           // QNXT Enrollment ID
            _request.Referral.MemId = "";              // QNXT Member ID
            _request.Referral.ServiceCode = "";        // QNXT Template ID
            _request.Referral.Cob = 0;
            _request.Referral.ReferTo = "";            // QNXT ReferTo Provider ID
            _request.Referral.EffDate = DateTime.Today.ToString();
            _request.Referral.ReferFrom = "";          // QNXT ReferFrom Provider ID
            _request.Referral.Emergency = 0;
            _request.Referral.ReferralDate = DateTime.Today.ToString();
            _request.Referral.TransferInOut = 0;
            _request.Referral.AdmitDate = DateTime.Today.ToString();
            _request.Referral.IssueInitial = "webusr";
            _request.Referral.DischargeDate = DateTime.Parse("2078-12-31").ToString();
            _request.Referral.TermDate = DateTime.Parse("2078-12-31").ToString();
            _request.Referral.PayToAffiliationId = ""; // QNXT PayTo Affiliation ID
            _request.Referral.AttProvid = "";          // QNXT Attending Provider ID
            _request.Referral.AppealDate = DateTime.Parse("2078-12-31").ToString();
            _request.Referral.AccChg = Convert.ToDecimal(0.00);
            _request.Referral.Acuity = "Urgent";
            _request.Referral.Admit = 1;
            _request.Referral.AdmitPhys = "";
            _request.Referral.AdmtProvid = "";         // QNXT Admitting Provider ID
            _request.Referral.AuthStatus = AuthStatusType.MEDREVIEW;
            _request.Referral.Diagnosis = "";
            _request.Referral.DisDiagnosis = "";
            _request.Referral.Dispositionid = "";
            _request.Referral.ProcessLogId = "";
            _request.Referral.ReceiptDate = DateTime.Parse("2078-12-31").ToString();
            _request.Referral.ReferToLocation = "";
            _request.Referral.ReferToPar = YesNoType.N;
            _request.Referral.ReferToProvType = "";
            _request.Referral.Source = ReferralMessageTypeSource.Q;
            _request.Referral.DecrementType = ReferralMessageTypeDecrementType.SVC;
            _request.Referral.Status = AuthStatusType.INPROCESS;

            _request.Referral.ReferralText = new ReferralTextMessageType();
            _request.Referral.ReferralText.Reason = ""; // referral reason text

            string accidentCause = "Auto";              // Auto, Employment, Others or No
            switch (accidentCause)
            {
                case "Auto":
                    _request.Referral.AccidentCause = ReferralMessageTypeAccidentCause.A;
                    break;
                case "Employment":
                    _request.Referral.AccidentCause = ReferralMessageTypeAccidentCause.E;
                    break;
                case "Others":
                    _request.Referral.AccidentCause = ReferralMessageTypeAccidentCause.O;
                    break;
                case "No":
                default:
                    _request.Referral.AccidentCause = ReferralMessageTypeAccidentCause.Item;
                    break;
            }

            _request.Referral.AuthDiags = new AuthDiagMessageType[1];
            _request.Referral.AuthDiags[0] = new AuthDiagMessageType();
            _request.Referral.AuthDiags[0].DiagCode = "100.0";
            _request.Referral.AuthDiags[0].Sequence = "1";
            _request.Referral.AuthDiags[0].DiagQualifier = AuthDiagMessageTypeDiagQualifier.PRINCIPAL;

            AuthCreateUpdateProxy _authProxy = new AuthCreateUpdateProxy(_proxy.Session);
            AuthCreateUpdateResponseMessageType _response = _authProxy.ProcessMessage(_request);
        }
    }
}

Note that the AuthCreateUpdateProxy ProcessMessage() call does not perform any data validation. So if this is important to you, you will have to perform validation outside of the SDK. Also, the SDK does not support:

  • Adding data to custom attributes
  • Creating an authorization service
  • Creating an authorization alert/memo

These operations will also need to be performed outside of the SDK.

Here is a list of some other tests we found useful in ensuring the quality of the SDK interface:

  • Add authorization with wrong or missing Member Id
  • Add authorization with wrong or missing Enrollment Id
  • Add authorization with wrong or missing Auth Template Id
  • Add authorization with wrong or missing Refer From Provider Id
  • Add authorization with wrong or missing Refer To Provider Id
  • Add authorization with wrong or missing Affiliation Id
  • Add authorization with wrong or missing Admitting Provider Id
  • Add authorization with wrong or missing Attending Provider Id
  • Add authorization with wrong or missing Diagnosis Code
  • Add duplicate authorization

Thursday, July 30, 2009

Custom NAnt ILMerge Task. No Assembly Required

ILMerge is a Microsoft utility that enables you to incorporate referenced .NET assemblies into a combined assembly, removing the need to distribute external assemblies. If you are using ILMerge, and you have an automated build process using NAnt, you have several options for integrating ILMerge into your build:

1. You can use the <exec> task and build your command line arguments as a string.

2. You can use a 3rd party (or write your own) custom tasks library like this one: http://code.google.com/p/ilmerge-tasks/wiki/HowToUse

3. Or you can write an inline custom task using the <script> tag in your NAnt script.


I started with option #1. I quickly realized that without <fileSet> support, this could become a difficult option to maintain.

I then moved to option #2. This library didn’t work for me. I suspect it’s because my \tools folder, which contains ILMerge.exe and ILMergeTask.dll, is on an R:\ drive, and that this is troublesome for the .NET reference search. I get back an error that ILMerge was not found.

Which left me with option #3. NAnt provides a <script> tag which allows you to write extensions to core functionality, so I wrote my own ILMergeTask. The thing to note here is that I could have compiled this into a task library as well. The real difference versus option #2 is that the ILMergeTask I wrote extends ExternalProgramBase (as opposed to Task), which allows me to pass in the path of ILMerge.exe.

Usage is pretty familiar:

<ilmerge outputfile="Combined.Assembly.exe"
         program="R:\tools\ilmerge\ilmerge.exe"
         primary="Primary.Assembly.exe"
         logfile="${log.dir}\ilmerge.log">
    <assemblies>
        <include name="*.dll" />
    </assemblies>
</ilmerge>

And here is the code for the custom task:
<script language="C#" prefix="custom" >
    <references>
        <include name="System.dll" />
        <include name="NAnt.Core.dll" />
    </references>
    <imports>
        <import namespace="System" />
        <import namespace="System.Collections" />
        <import namespace="System.Collections.Specialized" />
        <import namespace="NAnt.Core.Types" />
        <import namespace="NAnt.Core.Util" />
        <import namespace="NAnt.Core.Tasks" />
    </imports>

    <code>
        <![CDATA[
        [TaskName("ilmerge")]
        public class ILMergeTask : ExternalProgramBase {

            private FileSet m_assemblies;
            private string m_logFile;
            private string m_outputFile;
            private string m_primaryFile;

            // Path to ILMerge.exe, passed in via the "program" attribute
            [TaskAttribute("program", Required = true)]
            [StringValidator(AllowEmpty = false)]
            public override string ExeName
            {
                get { return base.ExeName; }
                set { base.ExeName = value; }
            }

            public override string ProgramArguments
            {
                get { return string.Empty; }
            }

            [BuildElement("assemblies", Required=true)]
            public virtual FileSet InputAssemblies
            {
                get { return this.m_assemblies; }
                set { this.m_assemblies = value; }
            }

            [TaskAttribute("logfile")]
            public virtual string LogFile
            {
                get
                {
                    if (this.m_logFile == null)
                    {
                        return null;
                    }
                    return this.Project.GetFullPath(this.m_logFile);
                }
                set { this.m_logFile = StringUtils.ConvertEmptyToNull(value); }
            }

            [TaskAttribute("primary", Required=true), StringValidator(AllowEmpty=false)]
            public virtual string PrimaryFile
            {
                get
                {
                    if (this.m_primaryFile == null)
                    {
                        return null;
                    }
                    return this.Project.GetFullPath(this.m_primaryFile);
                }
                set { this.m_primaryFile = StringUtils.ConvertEmptyToNull(value); }
            }

            [TaskAttribute("outputfile", Required=true), StringValidator(AllowEmpty=false)]
            public virtual string OutputFile
            {
                get
                {
                    if (this.m_outputFile == null)
                    {
                        return null;
                    }
                    return this.Project.GetFullPath(this.m_outputFile);
                }
                set { this.m_outputFile = StringUtils.ConvertEmptyToNull(value); }
            }

            protected override void ExecuteTask()
            {
                try
                {
                    Log(Level.Info, "Executing ILMerge.exe");

                    Log(Level.Info, string.Format("/out:\"{0}\"", m_outputFile));
                    Arguments.Add(new Argument(string.Format("/out:\"{0}\"", m_outputFile)));

                    Log(Level.Info, string.Format("assembly[{0}]: {1}", "primary", m_primaryFile));
                    Arguments.Add(new Argument(string.Format("\"{0}\"", m_primaryFile)));

                    for (int i = 0; i < m_assemblies.FileNames.Count; i++)
                    {
                        Log(Level.Info, string.Format("assembly[{0}]: {1}", i, m_assemblies.FileNames[i]));
                        Arguments.Add(new Argument(string.Format("\"{0}\"", m_assemblies.FileNames[i])));
                    }

                    // Only pass /log if a log file was configured
                    if (m_logFile != null)
                    {
                        Log(Level.Info, string.Format("/log:\"{0}\"", m_logFile));
                        Arguments.Add(new Argument(string.Format("/log:\"{0}\"", m_logFile)));
                    }

                    base.FailOnError = false;
                    base.ExecuteTask();
                }
                catch (Exception ex)
                {
                    throw new BuildException("Error executing ILMerge", Location, ex);
                }
            }
        }
        ]]>
    </code>
</script>

Big kudos to the guy on Google Code and the guy who wrote Reflector ;-)

Friday, July 17, 2009

How to harden your iMac Apache Web Server

Note: I am running Leopard 10.5.6 and Apache 2.2. Other versions of OS X and/or Apache may work differently.

A few months ago I started Apache Web Server on my iMac, and today I’m going to apply a “minimum necessary” configuration to Apache so it is more secure. In future posts, I plan to write about putting your Apache Web Server "in jail" and enabling DDNS ... so that your hardened iMac can be a web host on the Internet.

Prerequisite
When I first bought my iMac, I secured it using Apple’s Security Configuration as a guide. If you buy your iMac at Best Buy, you can also pay Geek Squad $40 to do this for you. If you’re looking for an abridged version, this guide is also good - http://www.macshadows.com/kb/index.php?title=Hardening_Mac_OS_X. Also, Apple puts out an audit tool with lots of other security tips - http://support.apple.com/downloads/Common_Criteria_Tools_for_10_4.

The first thing to do before securing your Apache Web Server is to decide on the functionality you want from Apache. I decided on some very basic items:

  1. Only static HTML pages will be served.
  2. OK, I’m also adding SSI (http://httpd.apache.org/docs/2.2/howto/ssi.html).
  3. The server must support the virtual hosting mechanism.
  4. The server must log all web requests (including information about web browsers).
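With SSI enabled, a page with an .shtml extension can pull in shared fragments via include directives. A minimal example (the file names here are illustrative):

```html
<!-- index.shtml: the INCLUDES output filter processes these directives -->
<html>
  <body>
    <!--#include virtual="/header.html" -->
    <p>Last modified: <!--#echo var="LAST_MODIFIED" --></p>
  </body>
</html>
```

The AddType/AddOutputFilter lines in the configuration below are what tie the .shtml extension to this processing.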
Configure Apache
To apply these configurations, you have to edit the web server configuration file.

Open a terminal: Finder->Applications->Utilities->Terminal
Edit the configuration file: sudo bbedit /private/etc/apache2/httpd.conf

bbedit is a text editor that I bought. You can use any text editor here, even TextEdit which came pre-installed on my iMac.

You do not need to enable the root password for this. If you are logged in as an Administrator user, you can sudo and execute this command. My finished configuration is as follows (in the interest of space, I removed the comments normally found in the default file):

ServerRoot "..."
Listen 80

LoadModule authz_host_module libexec/apache2/mod_authz_host.so
LoadModule include_module libexec/apache2/mod_include.so
LoadModule log_config_module libexec/apache2/mod_log_config.so
LoadModule expires_module libexec/apache2/mod_expires.so
LoadModule mime_module libexec/apache2/mod_mime.so
LoadModule dir_module libexec/apache2/mod_dir.so

User www
Group www

ServerAdmin esuyer at gmail dot com
UseCanonicalName Off
ServerSignature Off
HostnameLookups Off
ServerTokens Prod
DocumentRoot ".../www"
Timeout 300
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 15

MinSpareServers 5
MaxSpareServers 10
StartServers 5
MaxClients 150
MaxRequestsPerChild 0

<Directory />
    Options None
    AllowOverride None
    Order deny,allow
    Deny from all
</Directory>

<Directory ".../www">
    Options +Includes
    Order allow,deny
    Allow from all
</Directory>

DirectoryIndex index.htm

<FilesMatch "^\.([Hh][Tt]|[Dd][Ss]_[Ss])">
    Order allow,deny
    Deny from all
    Satisfy All
</FilesMatch>

<Files "rsrc">
    Order allow,deny
    Deny from all
    Satisfy All
</Files>

<DirectoryMatch ".*\.\.namedfork">
    Order allow,deny
    Deny from all
    Satisfy All
</DirectoryMatch>

ErrorLog "/private/var/log/apache2/error_log"
LogLevel warn

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
LogFormat "%h %l %u %t \"%r\" %>s %b" common
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio

CustomLog "/private/var/log/apache2/access_log" common

<IfModule alias_module>
    ScriptAliasMatch ^/cgi-bin/((?!(?i:webobjects)).*$) "/Library/WebServer/CGI-Executables/$1"
</IfModule>

DefaultType text/plain

TypesConfig /private/etc/apache2/mime.types
AddType application/x-compress .Z
AddType application/x-gzip .gz .tgz
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml

Include /private/etc/apache2/extra/httpd-vhosts.conf

<IfModule ssl_module>
    SSLRandomSeed startup builtin
    SSLRandomSeed connect builtin
</IfModule>

Compared to the default configuration file, the following important changes have been made:

  • The number of enabled modules has been reduced to a minimum.
  • Apache's processes (except for the root process) are set to be executed with unique, regular user/group privileges.
  • Apache discloses as little information about itself as possible.
  • Access rights to the website's content are set to be more restrictive.