Pretty often I have to do some SAP RFC calls (BAPIs) in my projects.
To retrieve a result structure, you often have to pass in an empty element of the same type.
In BizTalk this is pretty easy with a map: you just “connect” all the elements in the target schema to an existing root node and the transformation engine does the rest for you, creating the structure.
But when you have to create such a structure in code, you are pretty doomed to creating all those empty arrays line by line.
Recently, while writing a web service that calls a SAP BAPI, I got pretty tired of this.
So I wrote the following little code snippet that does the work for me in a generic way.
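The original snippet did not survive into this post, but the idea can be sketched roughly like this (the class name EmptyStructureFactory and all identifiers are mine, not the original code): walk the proxy type via reflection and pre-initialize every array property with an empty array, recursing into nested structures, so the RFC call receives a fully shaped but empty parameter.

```csharp
using System;

public static class EmptyStructureFactory
{
    // Creates an instance of T and recursively replaces every null
    // array property with an empty array of the proper element type.
    public static T Create<T>() where T : new()
    {
        return (T)Fill(typeof(T));
    }

    static object Fill(Type type)
    {
        object instance = Activator.CreateInstance(type);
        foreach (var prop in type.GetProperties())
        {
            if (!prop.CanWrite) continue;
            Type pt = prop.PropertyType;
            if (pt.IsArray)
            {
                // empty array instead of null
                prop.SetValue(instance, Array.CreateInstance(pt.GetElementType(), 0), null);
            }
            else if (pt.IsClass && pt != typeof(string) && pt.GetConstructor(Type.EmptyTypes) != null)
            {
                // nested structure: recurse
                // (no cycle guard here; fine for typical flat RFC structures)
                prop.SetValue(instance, Fill(pt), null);
            }
        }
        return instance;
    }
}
```

A sketch under the stated assumptions, not the exact code used back then; the generated SAP proxy types usually expose arrays as plain writable properties, which is all this relies on.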
Recently I was searching for a cheap but fast and decent external hard drive for my MacBook Air. In the past I used some external hard drives from Western Digital on my USB 2 port. That was a pretty slow experience compared with the internal storage speed of my MBA. I first found the Seagate GoFlex adapter, which is pretty affordable in the US (about $99 on Amazon), but the model available in Germany costs about €200. And that is the price just for the adapter, without any storage.
So I went on searching and found the Buffalo HD-PA500TU3, which is available for about €140 in Germany. A pretty good price for an external hard disk (500 GB) with a Thunderbolt AND(!) USB 3 connector.
As recommended by some customer reviews on Amazon, I planned to pimp my drive with a Samsung SSD, so I also ordered a Samsung 840 Pro 512 GB SSD.
The delivery reached me just in time on Christmas Eve.
Replacing the built-in Samsung hard drive of the Buffalo station with the SSD was pretty easy; there are some good instruction videos on the web. I connected the newly built “super drive” to my MBA and started the disk speed test. The test ran for about two minutes at a very high speed level, but suddenly Mavericks showed a disk disconnect message. Something went wrong. I repeated the tests with the same result. After some research I found several people reporting the same problems, even with the Seagate GoFlex adapter.
I sadly sent the SSD back to Amazon and ordered the Samsung MZ-7TE750BW 840 EVO Basic with 750 GB (!) at nearly the same price as the 512 GB Pro.
Two days later the SSD found its way into my Buffalo station and everything worked like a charm.
The benchmark results are amazing and I currently host all my virtual machines on that disk. For about €540 you get a very fast external drive with 750 GB in an “Apple”-like design. Hard to beat.
Below are the screenshots of the two benchmarks.
Performance benchmark of the Buffalo station with the ORIGINAL hard drive:
Performance benchmark of the Buffalo station with the Samsung EVO Basic 750 GB SSD drive:
In case you want to set a constant true or false value on a field of type xs:boolean in a BizTalk map, you may run into trouble with the default behavior of the BizTalk map designer.
You cannot assign a constant value using a String Concatenation functoid, because it is a different data type and the cast is not done automatically.
Furthermore, the BizTalk map designer regularly treats incoming true or false values of type bool as an indicator of whether a node will exist in the target document.
So the easiest way is clearly the following:
1. Create a Scripting Functoid in your map, leave the input empty and assign the output to your field of type xs:boolean.
2. Open the properties of the functoid and go to the “Script Functoid Configuration” tab.
3. Select “Inline C#” as the type from the list.
4. Place the following code in the script box.
(In case you want false as the value, change the code as necessary.)
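A minimal script for the box could look like this (the method name SetTrue is my choice; since the functoid has no inputs, the mapper simply calls the single public method and writes its return value into the target field):

```csharp
// Goes into the inline script buffer of the Scripting Functoid.
// No input parameters; the returned string is written into the
// xs:boolean field of the target document.
public string SetTrue()
{
    return "true";  // use "false" for the opposite constant
}
```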
This is about a problem my colleagues and I researched for several days until we found a solution.
The Problem: For a big customer we are creating several solutions with large data transfers.
Although we use Microsoft BizTalk Server as the main platform for our solutions, we prefer SSIS (SQL Server Integration Services) for processing large amounts of data (for obvious reasons).
This big customer has a bunch of SAP environments as well as some SAP Business Information Warehouse systems.
So we planned to extract “some data” from one of the SAP BW systems with an SSIS package using the Microsoft Connector for SAP BI, which works pretty well in general.
But there is one big pain with this component: depending on the mode you choose, different properties appear which have to be set.
In my case it was the “SAP BW Source” Dataflow Component.
If you choose the mode “E – Extract” at design time, you have to set the following values:
OHS destination and RequestID.
For the “OHS destination” you normally have a fixed value, but for the RequestID you only receive a value at runtime, so I cannot hard-code the value in there.
The regular way to solve this in SSIS packages is to define an expression which sets the value at runtime based on a variable.
But in this case Microsoft denied that: the property is simply not exposed, so you cannot set the value.
The Solution: After some days of extensive searching I figured out the following. There is a simple solution (which is definitely not supported by Microsoft!) which enables you to make a “hidden” property available for expressions. You just have to open your package (which internally is XML) as a text file (either in Visual Studio or any other text editor) and add or change the following value.
Navigate to the property you want to make “visible” (just search for it; in my case it is “RequestID”) and add or change the attribute “expressionType” on the element to “Notify”.
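In the raw .dtsx of that era the relevant fragment looks roughly like this (the id and the value are placeholders of mine, not an exact schema reference):

```xml
<!-- inside the Data Flow pipeline section of the .dtsx:
     property of the SAP BW Source component -->
<property id="..."
          name="RequestID"
          dataType="System.String"
          expressionType="Notify">REQU_PLACEHOLDER</property>
```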
Save the modified package and re-open it in the Visual Studio Package Designer.
Now go ahead and choose “Parameterize” from the context menu of the Data Flow task where the SAP BW Source component resides.
Choose the newly visible property from the list on top and assign a new or an already existing variable.
Press OK and you’re done!
That’s it. Normally your attribute change is kept even if you make other changes to the package. In case it gets lost, just do it again.
Please feel free to leave a comment below with your ideas, thoughts and criticism.
Some days ago I had to implement a data transfer for a customer that works this way:
A source system (e.g. SAP) provides invoices in a relational, flat style that looks like this:
So I get two lists of entities: first all invoices (namely the invoice heads) below the <Invoices> node, and second all invoice positions for all invoices under the <InvoicePositions> node.
But I had to deliver the data to the target in this way:
They want the invoice positions nested below their invoice.
Initial Solution: To achieve this I very quickly built a BizTalk map with a custom XSLT transformation. A very clean XSLT that worked fine. Good.
But the problem with this solution was that I initially did only some small data tests. When I started tests with large, real data transports my process got so terribly slow that it became a total “showstopper”.
What happened? With larger sets of data the transformation got slower and slower, because the number of iterations increased dramatically with every record. If you are a software developer and have ever looped over two arrays to map all elements from the first array to the second one, you may well know the problem. The invoices on the source side appear in an average ratio of 1:3 (invoice : positions). So if, for example, you have 100 invoices and 300 invoice positions, your XSL processor has to perform 100 x 300 (30,000) rounds to loop over all positions for all invoices. Well, that still sounds “affordable”.
But let's increase the number of invoices a bit. Say 1,000 invoices. That means 1,000 x 3,000 (3,000,000) rounds to go.
In my case the maximum set of data (a full initial data transfer) was 325,000 x 895,000, which means 290,875,000,000 rounds for mapping the data. The transformation (map) ran for a bit more than 4 days on my BizTalk server with a CPU usage of nearly 100%. => Total overkill
After some days of reflection I came up with the following…
Solution: First of all I have to say that this solution is pragmatic and works safely and well, but in my personal opinion it is not a very clean, idiomatic XSL solution.
To solve my performance issues I embedded custom code in the XSLT template.
In my custom code I created the possibility to build in-memory lookup tables (dictionaries), which are of course much faster.
With the new template the transformation completed in under 20 seconds!
Here is a sample of this xslt. Feel free to adopt or copy it.
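Since the original sample is not reproduced above, here is a reduced sketch of the approach (element names like InvoiceNo and the urn:invoice-lookup namespace are my assumptions, not the original schema): an msxsl:script block builds a hash table from invoice number to the matching positions on the first call, turning the inner loop into a constant-time lookup.

```xml
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    xmlns:lookup="urn:invoice-lookup"
    exclude-result-prefixes="msxsl lookup">

  <msxsl:script language="C#" implements-prefix="lookup">
    <![CDATA[
    // In-memory lookup table: invoice number -> XML fragment holding all
    // positions of that invoice. Built once on the first call, so every
    // later call is a hash lookup instead of a full scan over all nodes.
    Hashtable index;

    public XPathNodeIterator GetPositions(XPathNodeIterator allPositions, string invoiceNo)
    {
        if (index == null)
        {
            index = new Hashtable();
            while (allPositions.MoveNext())
            {
                string key = allPositions.Current.SelectSingleNode("InvoiceNo").Value;
                XmlDocument doc = (XmlDocument)index[key];
                if (doc == null)
                {
                    doc = new XmlDocument();
                    doc.LoadXml("<Positions/>");
                    index[key] = doc;
                }
                XmlReader sub = allPositions.Current.ReadSubtree();
                sub.MoveToContent();
                doc.DocumentElement.AppendChild(doc.ReadNode(sub));
            }
        }
        XmlDocument hit = (XmlDocument)index[invoiceNo];
        if (hit == null)
            return new XmlDocument().CreateNavigator().Select("*"); // empty set
        return hit.CreateNavigator().Select("/Positions/*");
    }
    ]]>
  </msxsl:script>

  <!-- Element names below are assumptions about the source layout -->
  <xsl:template match="/*">
    <Invoices>
      <xsl:for-each select="Invoices/Invoice">
        <Invoice>
          <xsl:copy-of select="*"/>
          <InvoicePositions>
            <xsl:copy-of select="lookup:GetPositions(/*/InvoicePositions/InvoicePosition, string(InvoiceNo))"/>
          </InvoicePositions>
        </Invoice>
      </xsl:for-each>
    </Invoices>
  </xsl:template>
</xsl:stylesheet>
```

The design point is that the node-set of all positions is walked exactly once; every invoice afterwards pays only one dictionary hit instead of scanning all positions again.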
This year, once again, I do not want to miss writing a short review reflecting on the events at this year's community conference, the NRWConf in Wuppertal.
Before I come to my ratings, I would like to explicitly thank the two organizers of the conference, Kostja (Konstantin Klein) and Lenny (Daniel Fisher), for their efforts in putting together and organizing this magnificent conference.
And now to my detailed ratings in the individual areas:
As in previous years, I traveled by train (ICE); at a bit over 5 hours from Munich it was a “pleasant” journey. My verdict: 9 out of 10 points
Arrival & Hotel
I arrived on time and, as in the previous year, stayed at the Arcade Hotel in the city center, very central and not far from the main station. In my opinion the hotel is underrated with its 3 stars: clean, friendly, quiet, great breakfast. My verdict: 8 out of 10 points
Pre-Conference Evening & Speaker Dinner
As a speaker, the speaker dinner on the eve of the conference was on my schedule again this year. The excellent food in a refined atmosphere soon let you forget all the “travel stress”, and nice conversations rounded off a successful start. My verdict: 9 out of 10 points
The Conference Day
As always, the conference started punctually and without much ado, and the first tracks were up and running in no time.
The distribution of the tracks and the lineup of topics and speakers was nearly perfect this year. I personally had (almost) no scheduling conflicts between sessions in the same slot. Big praise to the organizing team! The quality of the talks I attended was without exception very good, and I am glad to have gathered so much inspiration.
My own talk, after a delayed start (the previous session unfortunately ran almost 15 minutes over), was well attended, with very interested participants.
The lively discussions and the great feedback from many attendees at the end confirmed that I had hit the “right” nerve with my topic: “Get in touch with Microsoft BizTalk Server”. My verdict: 9 out of 10 points
As in previous years, the wrap-up with many speakers and attendees took place at Cafe Island, a location that has simply proven itself with its terrific ambience, exceptionally good food and cold drinks. My verdict: 10 out of 10 points
This NRWConf 2013, too, was worth every second and every euro!
It is and remains the highlight among the community conferences.
Next year the conference celebrates its 10th anniversary, and I am already looking forward to being there again.
Some days ago I implemented web harvesting functionality for a customer that works this way:
A business analyst places a request (trigger) file with some meta data in a network folder; BizTalk picks it up and performs an HTTP POST (HTTP adapter) containing the meta data from the request file.
Easily achieved with the great RawString class from Microsoft.
The result of the POST is HTML, which I catch in a multi-part message with one body part of type XmlDocument.
Using RawString as the type for the result body part as well would only work if you have a pipeline which sets the message type to RawString in the message context.
Otherwise you receive an error like “Type “” of message is unknown…”
But using XmlDocument isn’t that nice either. When you try to access the XmlDocument in the orchestration it is parsed “just in time” and the XmlReader will perform a DTD validation because of the html root element. So it tries to download the DTD from “http://www.w3.org/1999/xhtml” and may fail because a firewall is blocking this.
A web request timeout error message will appear in the log.
So you have the option to strip off the html root element in a custom pipeline (which I didn’t want to do), or you can extend the nice RawString class with a further constructor accepting an XLANGMessage as parameter. The code looks like this:
public RawString(XLANGMessage message, int bodyPartIndex)
{
    if (message == null)
        throw new ArgumentNullException("message");

    using (var stream = (Stream)message[bodyPartIndex].RetrieveAs(typeof(Stream)))
    using (var sr = new StreamReader(stream, true))
    {
        internalRepresentation = sr.ReadToEnd();
    }
}
You can use it this way in order to get your HTML from the result message. Afterwards you can easily access the HTML as a string directly or send it to a file.
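In a Message Assignment shape of the orchestration the new constructor can then be used roughly like this (message, variable and namespace names are mine, and this assumes RawString overrides ToString() to return the wrapped string, as the Microsoft sample does):

```
// msResponse: the multi-part response message, body part 0 holds the HTML
// varRawString: orchestration variable of type RawString
// varHtml: orchestration variable of type System.String
varRawString = new MyCompany.BizTalk.RawString(msResponse, 0);
varHtml = varRawString.ToString();
```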
If that helps, feel free to leave a comment below.
Recently I stumbled upon a problem that has definitely bitten me so many times over the last years that I decided to write this little blog post about it.
Problem: You extract a value of type xs:date from a BizTalk message and use it for further processing, e.g. calling a web service, querying a database or something else.
But the date changed. In my case I got a request message with a valuation date of “2013-09-01”.
After extracting the date from the message in an orchestration shape and assigning it to a variable of type System.DateTime, I noticed that the value seemed to have changed:
I saw the value “2013-08-31”.
Reason: Of course it did not change. Just the “representation” changed.
BizTalk internally works with UTC, so what appears is the UTC representation of the date-time value.
In my case (GMT+1) the date-time value appears as “2013-08-31 23:00:00”.
Solution: Just call the .ToLocalTime() method on the date you extracted from the message, and then you are fine. (Normally.)
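A minimal console sketch of what happens (the conversion result depends on the machine's local time zone; the example assumes a fixed GMT+1 zone as in the post):

```csharp
using System;

class LocalTimeDemo
{
    static void Main()
    {
        // What BizTalk hands you: the xs:date "2013-09-01" as UTC,
        // i.e. shifted one hour "earlier" than German local time.
        DateTime fromMessage = new DateTime(2013, 8, 31, 23, 0, 0, DateTimeKind.Utc);

        // Convert back into the server's local time zone.
        DateTime local = fromMessage.ToLocalTime();

        // On a fixed GMT+1 machine this is 2013-09-01 00:00:00 again
        Console.WriteLine(local.ToString("yyyy-MM-dd HH:mm:ss"));
    }
}
```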