Dec 28 2006
 

The client that I am working at wanted email notification when incoming HIPAA files were processed, and I was able to put this in place. The only issue was that when a file would not validate, all I could do was wait and then finally send out an email stating that the file did not validate in time. This would immediately spin up calls asking me why it did not validate, and hunting down the TA1 and 997 and interpreting them for the business analyst was not my favorite thing to do, especially since I knew that a more human-readable version was available. I finally had enough time to look into this. I had the Sender Qualifier (ISA05), Sender Identifier (ISA06), and the control number (ISA13). I determined the relationship between the audin table and the errors table, and eventually that I needed the errtxt table as well. I created the following stored procedure to capture the description and details as long as you provide the above-mentioned values.

Again, this will work for both the BizTalkEDIDb and the BizTalk_HIPAAEDIDb as the table structures are the same.

CREATE PROCEDURE dbo.captureDetails
    @senderQual varchar(2),
    @senderId varchar(15),
    @controlNumber varchar(9)
AS
SELECT errtxt.descrp AS description, errors.descrp1 AS details
FROM audin
    INNER JOIN errors ON audin.icin = errors.msgnr
    INNER JOIN errtxt ON errors.etc = errtxt.etc
WHERE (audin.sid = @senderId)
    AND (audin.icr = @controlNumber)
    AND (audin.sidcdq = @senderQual)
    AND (errors.inout = '2')
FOR XML RAW --, XMLDATA
GO
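A quick usage sketch: call the procedure with the ISA05, ISA06, and ISA13 values from the failed interchange. The values below are made-up samples, not real identifiers.

```sql
-- Sample values only; substitute the ISA05/ISA06/ISA13 from the interchange in question.
EXEC dbo.captureDetails
    @senderQual = 'ZZ',
    @senderId = '1234567890',
    @controlNumber = '000000905'
```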

 

*As a note, if a duplicate file (duplicate ISA13) is sent, this stored procedure will return all of those records: the first one and then the subsequent duplicates. If you wanted to, you could also capture the date from the audin table, but then it becomes a matter of tolerance on how far back you want to go.

Dec 21 2006
 

When setting up a process to deliver emails from BizTalk, I always think I have configured the SMTP information correctly, yet (this has happened to me four times now) when I start testing it, it continually fails. The error is as follows:

The message could not be sent to the SMTP server. The transport error code was 0x80040217. The server response was not available.

The resolution is that I had forgotten to configure the Send Handler with the SMTP server address. Even though the documentation states that the configuration is overridden in the send port configuration, you still need to set up the address initially.

Dec 21 2006
 

This is again one of the little-known facts when troubleshooting your HIPAA or EDI test or production environment. Since the service does not store the original filename of documents that are brought in using the file or FTP receive handler, you are stuck having to search for files using the control number and sender/receiver ids. This is ‘officially’ how you are supposed to do it, but how many of us in the EDI/HIPAA world have actually been born and raised following the standards? Rules are made to be broken! Yes, even if we have to adhere to the guidelines for correct payments, there are always the environments where the client calls up and states, ‘Where did my file go?’ Actually, come to think of it, I have never heard a client call in and state, ‘Where did my interchange go?’ If we were all using clearinghouses it would be one thing, but most (not all) of the clients I have worked for do not. I guess the client thinks, ‘Why pay for a service that does not add any more value than a server that stores files for us, which we can host ourselves?’

So the HIPAA and Base EDI services store the data in %documentshome%\System\External\{Inbox|Outbox} with a continually incrementing number and an .in/.out filename. A while ago I attempted to search for a particular trading partner’s files. Of course I did not know the filename, and I did not know their control number either, but I did know their sender id. I went into …\External\Inbox\, searched for *, put their sender id in the ‘A word or phrase in the file:’ box, clicked <Search>, and came up with nothing. Now, I knew that files with that id were in there, but the search capability was not working. I asked the network admin, and he said that it was because of a policy running on the servers and it could not be changed. I then had to resort to going to a DOS prompt.
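For reference, the DOS-prompt fallback looked something like the following. This is a Windows-only sketch, and the sender id is a made-up sample.

```shell
rem List the .in files in the inbox whose contents contain the sender id (sample value).
rem /m prints matching filenames only; /c: treats the argument as a literal string.
findstr /m /c:"1234567890" *.in
```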

New client, same issue. I called up the network admin and asked if they could change the policy. Smart guy, likable fellow, but his first word was ‘HUH?’ I explained the situation: no files containing their id in the text could be found. He placed a .txt file in the folder, filled it with a little-known football team’s name, ‘Raiders’, and then did a text search for the file. Voila! It found it.

I then told him I would look into it. I found this article, and then found the entry in my registry at HKLM\SYSTEM\ControlSet001\Control\ContentIndex\. I changed the FilterFilesWithUnknownExtensions value from 0 to 1, did my search again, and it started finding the files.
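The same change can be scripted as a .reg file; this is a sketch matching the key above (note that ControlSet001 is normally aliased as CurrentControlSet, and you should back up the registry before merging):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\ContentIndex]
"FilterFilesWithUnknownExtensions"=dword:00000001
```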

This is going to help me track down EDI files significantly!

Dec 19 2006
 

During the configuration of BizTalk 2004, while WMI is being configured, the following error appears:

Failed to deploy BizTalk system assembly "D:\Program Files\Microsoft BizTalk Server 2004\Microsoft.BizTalk.DefaultPipelines.dll". Unspecified exception: Unable to generate a temporary class (result=1).
error CS2001: Source file 'C:\WINDOWS\TEMP\zkf0npgt.0.cs' could not be found.
error CS2008: No inputs specified.

 


The resolution is to give yourself read/write access to the C:\WINDOWS\TEMP directory.

Dec 12 2006
 

This blog entry has been literally a year in the making!

While working at a client, the requirement was to separate claims and decide which system each went to (QNXT or the existing mainframe system).

I decided that using the multiple 837 schema was the best approach. This means that the HIPAA accelerator takes the HIPAA file and submits the claims (in XML format) individually to the MessageBox. I created a singleton orchestration to pick up each of those messages and individually go through some calls to find out which system each claim went to.

Once the decision was made, I would concatenate the message to the rest of the messages that had already come in for this HIPAA transaction.

What I saw happening was that during the concatenation process, it was taking longer and longer to append the current message to the rest of the messages.

Directions changed, and we moved away from having BizTalk be the routing application, for various reasons; speed of processing being one of the many.

I worked at another client, and the same issue came up. We started off working with eligibility files (834), and I took the same approach, and immediately saw the concatenation process taking longer to complete as it processed more messages. This time we were able to test with some significantly large files, so I could get some real numbers to look at.

It started out taking 1 second to process the first subscriber in the 834, and by the time it was down to subscriber 1000, it was taking 10 seconds to finish. I needed to come up with a different approach, because these were relatively small files, and we were looking at getting files with 200,000 subscribers.

I thought: I need a way to store the data in a manner that will not continually increase in time as the dataset grows. What could I use? A database table came to mind. I could place the data in a table, process it, and when it had all been completed, extract the data out of the table and send it off.

I implemented sending the data to a database table instead of concatenating the messages together. Once I started testing, I immediately saw an improvement in performance! Looking at the details, the first subscriber took 1 second to process, and so did subscriber 1000!

I was not satisfied, though: if each subscriber was going to take 1 second, then the table below shows the time to process a file.

Subscribers    Minutes    Hours
1,000            16.67     0.28
2,000            33.33     0.56
3,000            50.00     0.83
4,000            66.67     1.11
5,000            83.33     1.39
6,000           100.00     1.67
7,000           116.67     1.94
8,000           133.33     2.22
9,000           150.00     2.50
100,000       1,666.67    27.78
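The arithmetic behind the table is simply subscribers at one per second, converted to minutes and hours; a quick sketch:

```python
def processing_time(subscribers, seconds_per_subscriber=1.0):
    """Return (minutes, hours) to process a file at a fixed per-subscriber cost."""
    total_seconds = subscribers * seconds_per_subscriber
    return total_seconds / 60.0, total_seconds / 3600.0

for n in (1_000, 9_000, 100_000):
    minutes, hours = processing_time(n)
    print(f"{n:>7,} subscribers: {minutes:8.2f} min = {hours:5.2f} h")
```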

The question then was: how do I process them faster? How could I send them to the database faster than the improvements I had already made? I might be able to optimize the extraction and sending to the database a little, but even if I were to cut it in half, I would still be looking at 13-plus hours to complete a single file.

What if I ran multiple occurrences of the extraction process at the same time? I would break my singleton orchestration, but I would essentially open up the flood gates, and it could process as many messages as possible at the same time. The next question that came to mind: what about the very distinct possibility of table locking issues, since I would be doing multiple inserts into the same table at once? I needed a highly optimized process for inserting data into the database that could handle many inserts happening at once. I am also not a database guru, so I needed something that someone else had developed that I could implement.

BAM! It hit me. BAM (Business Activity Monitoring) is optimized to accept many messages and insert them into a table, and it definitely has to be designed to capture many messages at the same time. There are two flavors of BAM that can be invoked from BizTalk 2004: DirectEventStream and BufferedEventStream. Because DirectEventStream writes synchronously and would cause performance issues, I decided that the BufferedEventStream route was possibly the best approach. So I have many messages being processed, and the BAM data is sent to the MessageBox to be inserted into the BAMPrimaryImport database when BizTalk gets around to it.
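A minimal sketch of the BufferedEventStream call pattern, for illustration; the activity name, field names, and connection string below are made-up placeholders, not the actual names from my solution. BufferedEventStream is constructed against the MessageBox, not BAMPrimaryImport, since events are buffered and written asynchronously:

```csharp
using Microsoft.BizTalk.Bam.EventObservation;

// Placeholder connection string; buffer up to 50 events before they are sent.
BufferedEventStream es = new BufferedEventStream(
    "Integrated Security=SSPI;Data Source=.;Initial Catalog=BizTalkMsgBoxDb", 50);

string instanceId = System.Guid.NewGuid().ToString();
es.BeginActivity("SubscriberActivity", instanceId);
es.UpdateActivity("SubscriberActivity", instanceId,
    "SenderId", "123456",       // placeholder field/value pairs
    "SubscriberNumber", 1);
es.EndActivity("SubscriberActivity", instanceId);
es.Flush();  // push any remaining buffered events
```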

I implemented this approach, and increased the processing speed from 1 subscriber per second to 10 per second!

The next issue was: how do I know when it is complete, and when can I extract the data from the BAM database? I needed a monitoring process to watch for when the inserts for a file were done and, once they were complete, extract the data and create the output.

What if each of the processes that sent data to BAM also sent a message to another orchestration that consumes those messages? As soon as the messages quit coming, it would check the database to make sure the rows are there, and once all of the rows are there, extract the data.

I thought this would be a very simple process. It ended up being so (kinda), but I normally have to do things the hard way before finally getting them working successfully, and this did not stray too far from my past experiences.

This is the design I had: many orchestrations would be running, and one orchestration would be picking up all of the messages created by the HIPAA to BAM orchestrations. As soon as I quit receiving messages, I would make sure the number of rows in BAM matched the number of messages I had picked up; once everything matched, I would extract the data. I have to check the row count against what I picked up because with BufferedEventStream, messages are sent to the MessageBox and inserted when resources are available, not directly as with DirectEventStream. So I could get the last message from the HIPAA to BAM orchestration before the last row is inserted. Below is the vision I had:

 

This is where it got fun!

After using Kevin Lam’s blog as a guide, I implemented forward partner direct binding.

I have created a simple prototype implementing the forward partner direct binding approach. The first orchestration consumes all messages from a particular port. The sample message looks like this:

<ns0:Root SenderId="123456" xmlns:ns0="http://PartnerPortExample.Input">

  <Data>

      <Information>Information</Information>

  </Data>

</ns0:Root>

It would then create a message containing just the SenderId, to be sent to the Singleton orchestration, which would correlate on it and pick up all messages for that SenderId.

<ns0:Root SenderId="123456" xmlns:ns0="http://PartnerPortExample.Status" />

I promoted the SenderId in the http://PartnerPortExample.Status message. One key thing to take away is that the property needs to be a MessageDataPropertyBase. If it is a MessageContextPropertyBase it will not work: the subscription engine cannot match the message from the HIPAA to BAM orchestration to the Singleton orchestration, and it will state that no matching subscription could be found.
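For illustration, a property schema entry of the right kind looks roughly like this; the GUID is a placeholder and the element name is from my prototype. The important part is propSchFieldBase being MessageDataPropertyBase rather than MessageContextPropertyBase:

```xml
<xs:element name="id" type="xs:string">
  <xs:annotation>
    <xs:appinfo>
      <b:propertyInfo xmlns:b="http://schemas.microsoft.com/BizTalk/2003"
                      propertyGuid="00000000-0000-0000-0000-000000000000"
                      propSchFieldBase="MessageDataPropertyBase" />
    </xs:appinfo>
  </xs:annotation>
</xs:element>
```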

I then set the outgoing port on the HIPAA to BAM orchestration to Direct and chose the Singleton orchestration as the partner port. In the Singleton orchestration, I set the partner port to itself.

I also set up the correlation set to drive off of the SenderId.

Below are some screen shots of the prototype:

Here is the Process orchestration that takes the original file and extracts the SenderId into the Status message. Some things to notice: the binding on the InternalPort is set to Direct, and the Partner Orchestration Port is set to the Singleton orchestration.

The code in the Assign Promotion Message Assignment shape is the following:

StatusMessage(PartnerPortExample.id)=ExternalMessage.SenderId;

 

Here is the Singleton orchestration that loops through, capturing all of the messages that match the PartnerPortExample.id correlation set.

It then creates a message recording the number of files it processed and sends the following message with the SenderId as the filename:

<ns0:Root Count="95" xmlns:ns0="http://PartnerPortExample.Result" />

Here is the code in the message assignment shape:

 

TempXML = new System.Xml.XmlDocument();
TempXML.LoadXml("<ns0:Root Count=\"" + System.Convert.ToString(Count) + "\" xmlns:ns0=\"http://PartnerPortExample.Result\" />");
ResultMessage = TempXML;
ResultMessage(FILE.ReceivedFileName) = StatusMessage(PartnerPortExample.id);

 

I want to thank Jeff Davis, Keith Lim, Kevin Lam, and Adrian Hamza on helping me determine that you cannot have context properties be the correlation set on partner ports. 

Let me know through my ‘contact me’ page if you would like a copy of my prototype.

Dec 07 2006
 

Let’s just say that there is a set of published companion documents that my client has distributed to one of their clients. In them, the list of valid values is even more restrictive than what is published publicly. These codesets are valid only for this client, and additional codesets are valid for another client.

The ability to make partner specific schemas is possible! Yes, I know, this is what we all have been losing sleep over!

If you really want to make partner-specific schemas, it is possible. This works on both v3.0 and v3.3.

It is pretty straightforward:

  1. Make sure that you have the party defined.
  2. The party has to be defined in a receive location; if it is not, the schema will not show it as an available Partner URI.
  3. In the schema that you want customized, click on the root node; in the properties there is a Partner URI drop-down list.
  4. Choose the partner you wish to make the customization for from the drop-down list.
  5. Change the target namespace to make it unique for that partner.
  6. Make your modifications to the schema.
  7. Validate the schema so the customization is uploaded to the database.
  8. Deploy.

Note: If you do not see the partner in the Partner URI list even though you have defined the receive location (possibly with a binding file), redefine the receive location by changing the address to another partner, then assign it correctly again, restart the HIPAA service, and it should show up.

You now have a partner-specific schema that BizTalk will parse depending on the party definition you have defined. There you can make your specific mapping for that client. In my case, I am able to have the accelerator parse the client’s file using their additionally restrictive schema definition and then map it to the standard schema, wherein it goes into the universal mapping that has been developed for all of the clients.

Actually, this functionality exists in both the HIPAA accelerator and the Base EDI adapter. From ages past I worked on the Covast accelerator, but I can’t remember whether that functionality is present there.

Not too bad.

Dec 06 2006
 

On our development machine we do not have Office installed, so on my local workstation I copied the BAM.xls spreadsheet to create my activities. When I opened the workbook, I received the error ‘Error in creating BAM Menu’. Clicking on the Details button, it states:

Error Description:

ActiveX component can’t create object

Error Source:

XmlParser:subLoadBamXML

I installed MSXML 4.0 SP2 on my local machine, and now BAM.xls loads fine. You can download it here.

Dec 01 2006
 

I am always having to modify the standard queries in HAT, so I finally broke down and created a new query in HAT.

This shows me the last 100 orchestrations run.

SELECT top 100
[Service/Name], [Service/Type],
[ServiceInstance/State],
dateadd(minute, @UtcOffsetMin, [ServiceInstance/StartTime]) as [StartTime], -- can't use 'as [ServiceInstance/StartTime]' since this prevents SQL from using the index on that column (conflicts with ORDER BY)
dateadd(minute, @UtcOffsetMin, [ServiceInstance/EndTime]) as [EndTime], -- can't use 'as [ServiceInstance/EndTime]' since this prevents SQL from using the index on that column (conflicts with ORDER BY)
[ServiceInstance/Duration],
[ServiceInstance/ExitCode],
[ServiceInstance/ErrorInfo],
[ServiceInstance/Host],
[Service/AssemblyName],
[ServiceInstance/InstanceID],
[ServiceInstance/ActivityID],
[Service/ServiceGUID],
[Service/ServiceClassGUID]
FROM dbo.dtav_ServiceFacts sf WITH (READPAST)
WHERE [Service/Type] like '%Orchestration%'
ORDER BY sf.[ServiceInstance/StartTime] desc

 

The only other thing you need to do is near the top of the .trq file: replace the title with:

<anyType _locID="1" xsi:type="xsd:string">Most recent 100 Orchestrations</anyType>

Save the file back in the Tracking folder as Top100Orchestrations.trq, restart HAT, and you are good to go.