""


Introduction to CMC Monitoring in SAP BO4 SP4

A few months ago a client asked me how to stay aware of problems (disconnections) in the different servers and services of their SAP BO4 SP4 installation without needing a person to constantly check the status. I started to research the monitoring feature in the CMC. The truth is that there isn't much information about it: just some pages in the Platform Administrator Guide and a few posts or references scattered around the web. With this post I will try to compile all the information I have found, explain the use of these interesting features and give you a brief introduction to CMC Monitoring.

First of all, let me explain what CMC Monitoring is: Monitoring is a new feature of the CMC, introduced with SAP BusinessObjects 4, that allows administrators to see the usage and health of the BO system. For this, it uses watches, probes, metrics and alerts. There are default ones, which allow the administrator to start playing around with monitoring without deep knowledge of the topic and to obtain really useful information about the status of the system. Before using this feature we need to know the actions we must perform and the existing limitations.

Also, we must ensure the monitoring information is stored in the Monitoring database. This is a Derby database and it works independently of the other two databases in SAP BusinessObjects. To activate the Monitoring database we need to log into the CMC, go to Applications, and double-click on the Monitoring Application. This opens the Properties window of the Monitoring Application. Make sure that “Enable Monitoring Application” is checked, then click Save and Close. If the option was not already checked, the final step is to restart the Adaptive Processing Server (APS). The information in the database is deleted every time it reaches 1 GB; we can increase that size in the CMC.

Monitoring

To reach the monitoring section in the CMC, we need to go to Applications > Monitoring

Monitoring default view

As we can see in the above image, there are 5 different tabs in this section: Dashboards, Metrics, Watchlists, Probes and Alerts. Let's run through each one:

1. Dashboards

It is the default tab and, as the name says, it is a dashboard, which is divided into 5 parts:

  • Overall Health: An icon showing whether the BO system is working correctly or not.
  • Recent Alerts: Here we can see the alerts triggered by the system in the last 24 hours.
  • Graphical view of BI Landscape: This panel includes a graphical or tabular view of the watches within the system. We can drill down by clicking on them.
  • KPIs Status: This part displays three Key Performance Indicators for the system. We can also drill down by clicking on them.
  • Deployment Level Metrics: Here we can see three interesting metrics: running jobs, users and pending jobs.

2. Metrics

Metrics are individual pieces of information that help us assess the health of the different components of the system.

Metrics image

Metrics are used to measure the health of a component. The metrics we define will depend on the components we are monitoring and our requirements. Examples of metrics are user login time, query execution time, CPU usage percentage, availability status for a system service, etc. We can use them in Watches and Alerts, setting thresholds for Warning and Danger alerts. There are around 250 predefined metrics. Furthermore, we can create our own metrics if we think they are necessary. Normally, with the default ones, we can check the status of almost everything in the system.

If we select a metric, we can observe its historical and current status.

3. Watchlists

This tab lets us monitor the watches that we have created. Watches are thresholds on metrics that show us whether the metric is in a healthy, warning or danger status.

Watchlists image

Again, there are default watches already created. We can change their thresholds to adapt them to our system. We can also attach notifications to watches; for example, we can have a watch send an email if the system is in danger. There are a lot of interesting options with watches, and I encourage you to try them yourselves.

Finally, we can decide which watches we want to see in the Dashboards tab.

4. Probes

Probes give us the ability to monitor our SAP BusinessObjects system using simulated application workflows which are run through SDK-based scripts.

Probes image

This is the most interesting part of CMC Monitoring but, at the same time, the most difficult. As we can create probes with an SDK, the power of this tab is huge: we could create whatever we want, but creating these probes is not easy. There are few examples around the net, and those that exist are very difficult to find. (I will try to add another entry to this blog in the future with an example.)

As I mentioned before, the system provides default probes. We can choose among these 9:

  • CMS Logon Logoff probe: Monitors whether a user can successfully log on to the SAP BusinessObjects system and how long it takes.
  • Crystal Reports service through Page and Cache Server probe:  Monitors the availability and performance of the Crystal Reports service (through the page and cache servers).
  • Crystal Reports service through Report Application Server probe: Monitors the availability and performance of the Crystal Reports service (through RAS).
  • InfoView probe: Monitors the availability and performance of logging on to and off the InfoView web application.
  • Web Intelligence Service Monitoring probe: Tests the availability and performance of the Web Intelligence Service through the report servers.
  • CMS Ping probe: Tests the core functionality of the CMS.
  • CMS Cache probe: Tests the availability of the CMS cache.
  • CMS Database Connection probe: Tests the connection to the CMS repository.
  • Start and Stop Servers probe: Tests starting and stopping the different servers of the system.

5. Alerts

This is where all the alerts that we have set up in the Watchlists will appear.

Alerts image

This is a list of the alerts the system raises. We can see the details of each alert by clicking on it. These alerts can also send emails and perform actions, depending on what we have set up in the watchlist.

How to use it all together

With all the above information I have given an overview of the different monitoring tabs and what we can do with each one of them.

If we want to tap the full potential, it is easy to see that all the tabs are connected and we need to use them all. A nice example is monitoring the response time to open a report in the system. For this, we use the "Web Intelligence Service Monitoring" probe, which logs in and tries to open and refresh a report in Web Intelligence. We then use the metric that measures how long the system takes to execute the probe and the metric that says whether the probe was successful. We also create a watchlist on these metrics: if the probe takes more than 5 seconds, it sends an email advising the administrator that the system is working a bit slow; if the probe fails or takes more than 15 seconds, it sends another email advising that the system is not working properly. We can schedule this probe to be executed on a daily basis.

With this simple scenario we are able to detect problems in the system automatically, before any user reports them. Moreover, the Alerts tab will show us if there is something wrong that we had not noticed before.

Conclusion

In conclusion, by playing around with all of these features we can accomplish really interesting things and get almost everything in our BO system under control. It may appear difficult to learn, but there is a lot we can do just with the default settings. Once you get used to them, it is time to play with the power of self-created probes and add more personalized options to your monitoring.

I hope this introduction to monitoring has helped you to understand a bit more how CMC Monitoring works in SAP BO 4 SP4.

If you have any questions or suggestions, please leave a comment below.

Using a Java Application as a Source of information with SAP Data Services

After my post on how to use Data Services SDK libraries to construct an AWTableMetadata in a Java application, in this post I will explain how to access a Java application as a source of information by using the WebService DataStore in SAP Data Services.

To do this we need:

  • Tomcat Server (where our service will be hosted; we use the Tomcat 6 bundled with BI4)
  • Axis2 library (1.6.2)
    • Standard binary distribution (zip)
    • WAR distribution (zip)
  • Eclipse (to build our web service)
  • Data Services SP3

 

1. Building the web Service

Step 1: Unzip the Axis2 WAR distribution into the folder …\SAP BusinessObjects\Tomcat6\webapps, replacing the default axis2 folder.

Step 2: Unzip the binary distribution of Axis2 into a folder called Axis2 (C:\Axis2\axis2-1.6.2).

Step 3: Set the AXIS2_HOME system variable. Go to Control Panel -> System -> Advanced system settings -> push the Environment Variables button and create a new variable named AXIS2_HOME pointing to the folder from Step 2 (C:\Axis2\axis2-1.6.2).

Step 4: Set up the CATALINA_HOME variable. This is where the Apache Tomcat folder is located. Go to Control Panel -> System -> Advanced system settings -> push the Environment Variables button and create a new variable named CATALINA_HOME pointing to the Tomcat folder.

Step 5: Restart the Tomcat 6 server (turning off our BI4 installation during this process is recommended).

Step 6: Inside Eclipse, right-click the project and export our Java application as a JAR file.

Step 7: Set the export destination to the following path:

…\SAP BusinessObjects\Tomcat6\webapps\axis2\WEB-INF\services\OURPROJECT.aar

where “OURPROJECT.aar” is the name of our web service with the .aar extension.

Step 8: Restart Tomcat. Once Tomcat has started, access http://localhost:8080/axis2/services, where we should see the list of deployed services:

Step 9: Click on our service, in this case Clariba_SMT. We will be redirected to the following URL:

http://localhost:8080/axis2/services/Clariba_SMT?wsdl

Copy this URL into a notepad; we will need it when setting up the Web Service DataStore.
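
For reference, the application we export in step 7 is just a plain Java class: Axis2 exposes its public methods as web service operations. Below is a minimal sketch of what such a class could look like; the method body and return type are illustrative placeholders, since the real service returns table data as described in my previous post.

public class ClaribaSMT {

    // Axis2 publishes the public methods of a deployed POJO as web
    // service operations, so this method becomes the operation
    // "getTableTweeetsEN" once the .aar file is deployed.
    public String[] getTableTweeetsEN() {
        // Placeholder payload; the real implementation returns the
        // table of English tweets built by our application.
        return new String[] { "tweet 1 (EN)", "tweet 2 (EN)" };
    }
}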


2. Setting the Web Services Data Store in Data Services

Step 1: Open the Data Store perspective, right-click inside it and select New.

Step 2: Set the name of our new extractor to “WS_ClaribaSMT” and then enter the URL where our web service is located. Data Integrator needs the WSDL descriptor, so we use http://localhost:8080/axis2/services/Clariba_SMT?wsdl

Step 3: Import the functions from the web service that we are going to use. In this example we select “getTableTweeetsEN” for English and “getTableTweeetsES” for Spanish.

 

Conclusion: Now we are ready to use our web service as a function call inside a transform in a Data Services data flow. Keep in mind that our Java application has to return a type of value that Data Services can interpret easily, for example a table (see my previous post for more information). Subsequently we have to apply a special set of transformations to interpret the information from the web service, which I will cover in my next post.

If you have any related tips or suggestions, please leave a comment below.

Providing SSM data to SAP BusinessObjects reporting tools - Exhausting all scenarios

In the last two months I have visited a couple of customers who were interested in the integration of SAP Strategy Management 10.0 with SAP BusinessObjects BI Platform 4.0. In this post I would like to explain what that integration is about.

First of all, a brief introduction to the solution. SAP Strategy Management (SSM) allows a company to align its strategy plan and key objectives and spread them across the whole organization. It is an out-of-the-box Enterprise Performance Management solution in which you can insert the most important KPIs of your company in Balanced Scorecards or Strategy Maps and control their performance in relation to the strategy of the company. Within SSM you can also create initiatives to implement improvements or corrective actions and link them to the objectives or KPIs.

Sometimes the customer has requirements to implement reports or further analysis on top of SSM that cannot be covered with the standard functionality of the solution. In that case, the best option is to implement the integration between SSM and the SAP BusinessObjects reporting tools. Some examples of what you can do with that integration:

  • Implement bespoke Dashboards with your SSM KPIs and Objectives
  • Implement universes and allow users to exploit information from SSM with Web Intelligence without consuming additional SSM licenses
  • Implement pixel-perfect Crystal Reports on top of SSM
  • Foster ownership with Publications. For example, send a list of KPIs that are not performing well to their responsible users

To implement such an integration, you can follow the SAP SSM Configuration Guide. However, in some cases there is a lack of documentation and you have to go off-script. In other cases, the existing documentation has not yet been updated to BI 4.0 and SSM 10.0. And finally, sometimes the documentation is wrong or the software has bugs and you cannot set up the integration.

In this post we will analyze the following scenarios:

  1. Web Intelligence reporting on top of SSM Models
  2. Web Intelligence reporting on top of an SSM Data Model (Clariba-developed solution)
  3. Crystal Reports on top of SSM (exploring different options)
  4. Dashboards on top of SSM (exploring different options)

These scenarios have been implemented with the following software components:

  • SAP NetWeaver 7.3 SP08
  • SAP SSM 10.0 SP06
  • SAP BusinessObjects BI Platform 4.0 SP05
  • Crystal Reports 2011 SP05
  • Dashboards 4.0 SP05

1. Web Intelligence reporting on top of an SSM Model

As per SAP documentation, we can set up the ODBO Provider in order to build a universe on top of SSM Models. The problem is that we still have to use the Universe Designer instead of the Information Design Tool of BI 4.0. Below are the steps for setting up the ODBO Provider and implementing your first report on top of SSM:

  • Go to your BO 4.0 server and make sure you have a Multidimensional Analysis Service (MDAS) in the BO server. Stop the MDAS Server and the Connection Server in the Central Configuration Manager
  • Copy the ODBOProvider folder from the <drive>:\Program Files (x86)\SAP BusinessObjects\Strategy Management\InternetPub path of your SSM server to the BO server
  • On the BO server, run the SSMProviderReg.bat file from the copied folder. Make sure you have administrator rights on the BO server. Once completed, check in regedit that you are able to find the entry SSMProvider.1 in the Windows registry
  • Modify the Windows registry to insert the following string under this key (assuming that you are using a 64-bit Windows): HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\SAP\SSM\ODBOProvider "servletUri"="/strategyServer/ODBOProviderServlet"
  • Access the following path on the BO server: <drive>:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\dataAccess\connectionServer\oledb_olap and add the following entry to the oledb_olap.sbo file located there:

<DataBase Active="Yes" Name="Strategy Management 10.0">

<Aliases>

<Alias>Strategy Management 10.0</Alias>

</Aliases>

<Library>dbd_sqlsrvas</Library>

<Parameter Name="Family">SAP BusinessObjects</Parameter>

<Parameter Name="Extensions">sqlsrv_as2005,sqlsrv_as,oledb_olap</Parameter>

<Parameter Name="MSOlap CLSID">SSMProvider.1</Parameter>

</DataBase>

  • Start the MDAS Server and the Connection Server
  • Now we can go to the Universe Designer and start implementing a universe on top of our model. The first step is defining the connection. In the connection list (retrieved from the Connection Server) we can now see the new entry we inserted, Strategy Management 10.0:
  • Define the connection parameters. You must use an SSM user with proper permissions and you must provide the server's complete address (FQDN) and its port.
  • Once connected to the SSM server, you will see the list of available cubes. The AS category allows access to the measures (based on attributes and dimensions) in the Application Server model. The SM Adapter allows access to the strategy dimension, which represents the Strategy Management dimensions Scorecard and Initiative. Scorecard details not relating to the KPI, such as comments, are not presented.
  • And finally we will see our universe with the available dimensions, the standard classes (Time and Scorecards) and the measures. You can display the technical names of the objects as details, or you can define hierarchies of Perspectives, Objectives and KPIs.
  • Now we can publish the universe and go to Web Intelligence to start implementing our reports on top of the SSM models.

2. Web Intelligence reporting on top of the SSM Data Model

This is a solution you can implement if you have advanced knowledge of the SSM data model. You can implement a UNX universe with the Information Design Tool by linking all the tables of the Entry and Approval, the Nodes of the Scorecard, the Cube Builder or the Initiatives. The advantage of this option is that you have access to extra information not available in the SSM cube, such as the Initiatives, the users related to specific KPIs, the attributes of the KPIs, etc.

If you are interested in such an option, please contact us. Keep in mind that it is not an option supported by SAP, but we have implemented it many times and we know it works.

3. Crystal Reports on top of SSM

The integration of SAP Crystal Reports with SSM can be done in 4 different ways:

  • Implementing a Query as a Web Service from the universe built in the previous steps: we have managed to implement this scenario
  • Connecting Crystal Reports to the universe on top of the SSM Models we built in the first step: this scenario is not working although we followed the instructions from SAP
  • Using an OLE DB (ADO) connection: this option, not documented by SAP, becomes available if we install the ODBO Provider, but we have not managed to make it work
  • Using an OLAP Connection: according to SAP documentation, we can build an OLAP cube report in Crystal Reports. We should be able to select the Strategy Management option in the following screen to insert the SSM connection data, but we have not been able to find that option

4. Dashboards using Web Services Connections

Using Dashboards, you have two different ways to access data:

  • Implement a Query as a Web Service (QaaWS) to retrieve the relevant information from the universes we had implemented before.
  • Use the Web Services available in SAP NetWeaver to retrieve information from SSM: according to SAP documentation, you need to download the WSDL files of the SMDataServiceService and CubeServiceService applications and call the functions within them. With our Dashboards 4.0 SP05 we have not been able to process the WSDL files, as the tool is unable to load the URL.

Summary

Providing SAP Strategy Management information to the SAP BusinessObjects BI Platform 4.0 can enhance the capabilities of your strategy system. However, the integration is not so easy given the lack of information on the topic and the quality of the existing information. We tried to implement all possible integration scenarios and succeeded with Web Intelligence; this is the route we recommend, as the Crystal Reports and Dashboards scenarios were not working with SSM 10.0 and BI 4.0.

We will follow up on these issues and let you know if we finally manage to solve them. If you have any suggestions or if you have found a workaround, please leave a comment below.

Using Google Chrome with SAP BusinessObjects

We all know that there are many internet browsers available, but Google Chrome is definitely one of the most used nowadays, and therefore we have had a lot of feedback from our customers about using Chrome with SAP BusinessObjects.

A main problem found by users is that when using Google Chrome with InfoView or BI LaunchPad, a missing plug-in error screen appears when trying to modify a report; an HTTP Status 500 error screen also appears when trying to log in to SAP BusinessObjects Explorer. In this blog I will provide a solution to these issues.

Issue when logging in to InfoView or BI LaunchPad in Google Chrome
Issue when accessing SAP BusinessObjects Explorer

The solution we have found is to use a Google Chrome add-on called “IE tab”, which emulates Internet Explorer on Chrome.

Steps to install it:

  1. From Google Chrome, enter the following link in the URL bar and install the add-on: https://chrome.google.com/webstore/detail/hehijbfgiekmjfkfjpbkbammjbdenadd
  2. Once installed, you will see a small folder with the IExplorer logo in the top right corner of Google Chrome.
Folder with Explorer logo
  3. Click on the folder and another URL bar will appear.
URL bar appears
  4. Enter the InfoView / BI LaunchPad link in the new bar and start working with it.

Moreover, in case you need to click on an Open Document link and you want Google Chrome to open it automatically, you will need to add your server's URL so it can open successfully. To do this, follow these steps:

  1. Right-click on the small folder with the IExplorer logo
  2. Choose Options
  3. Add the server's address in the Auto URLs field

The outcome

SAP BusinessObjects web-based applications can be successfully used with Google Chrome, as the following examples show:

Modifying a report on InfoView
Viewing and managing spaces from SAP BusinessObjects Explorer

Summary

With the help of Google Chrome, the main benefits of applying these tips are:

  • Modifying documents in InfoView / BI LaunchPad
  • Logging in and managing spaces in SAP BusinessObjects Explorer
  • Increased navigation speed while going through the platform folders

According to SAP's official PAM (Product Availability Matrix), Google Chrome is not entirely supported due to its fast development pace, so using this add-on and updating it frequently is highly recommended.

I hope this helps you have a better experience when working with SAP BusinessObjects. If you have any doubts or suggestions, please leave a comment below.

How to configure SQL Server connectivity for WebI from SAP BusinessObjects BI4.0 in Linux

Nowadays we have noticed that some of our customers are following the trend of open source products. Indeed, Linux is a great choice of operating system: it is totally compatible with SAP BusinessObjects BI 4 and it also helps companies cut costs. However, Linux has retained the way the classical Unix operating system works, so everything is about rights and batch commands, and advanced Linux technical know-how is required before getting into it.

The purpose of this blog entry is to share the issues we faced at one of our customers running SAP BusinessObjects BI4 SP4 on a Red Hat Enterprise Linux Server release 6.3, using MySQL 5.1.61 as the system database, and how we solved them.

The issue came out right after a production database migration (to a brand new SQL Server 2008): all their WebI documents stopped running from the SAP BI4 Launchpad with an unusual error, "Database Error .[ (IES 10901)", blocking every single WebI document from running, and the whole core business was jeopardized. The Rich Client on Windows did not experience any problem. After a first analysis, we discovered that the default SQL Server ODBC driver installation was only configured properly for 32-bit connections on the Linux server, whereas WebI requires 64-bit ODBC driver connectivity to run in the SAP BI4 Launchpad.

At this point we had to apply a couple of OSS notes. The first one was OSS note 1607125, "How to configure SQL Server connectivity for WebI from a BI4.0 unix environment". The resolution is:

1. Open env.sh under <install directory>/sap_bobj/setup/

2. Search for the following line

LIBRARYPATH="$LIBDIR:$LIBDIR32:$WCSCOMPONENTDIR:$PLUGINDIST/auth/secEnterprise:${CRPEPATH64}:${CRPEPATH}:${MWHOME}:$PLUGINDIST/desktop/CrystalEnterprise.Report:${BOBJEDIR}enterprise_xi40/$SOFTWAREPATH32/ras:${BOBJEDIR}mysql/lib”

3. Modify the line above by adding the following

":${BOBJEDIR}enterprise_xi40/linux_x64/odbc/lib:${BOBJEDIR}enterprise_xi40/$SOFTWAREPATH32/odbc/lib"

The line should look like this

LIBRARYPATH="$LIBDIR:$LIBDIR32:$WCSCOMPONENTDIR:$PLUGINDIST/auth/secEnterprise:${CRPEPATH64}:${CRPEPATH}:${MWHOME}:$PLUGINDIST/desktop/CrystalEnterprise.Report:${BOBJEDIR}enterprise_xi40/$SOFTWAREPATH32/ras:${BOBJEDIR}mysql/lib:${BOBJEDIR}enterprise_xi40/linux_x64/odbc/lib:${BOBJEDIR}enterprise_xi40/$SOFTWAREPATH32/odbc/lib”

4. Navigate to <install directory>/sap_bobj/enterprise_xi40

5. Open the odbc.ini file using vi or another text editor.

6. Find the entry for the SQL Server DSN. The default DSN entry in odbc.ini is called "[SQL Server Native Wire Protocol]", but it is recommended that you create your own DSN entry using the same parameters specified in the default DSN.

7. Update the "Driver" section of the DSN to point to 64 bit version of SQL Server ODBC drivers

Driver=<install directory>/sap_bobj/enterprise_xi40/linux_x64/odbc/lib/CRsqls24.so

8. Restart the SIA

However, the issue was not completely resolved. Whenever we tried to run a WebI document we received a new error: "Database error: [DataDirect][ODBC lib] System information file not found. Please check the ODBCINI environment variable. (IES 10901) (WIS 10901)". This is a configuration issue on the Linux operating system with the ODBCINI environment variable. Make sure your environment variables are set correctly according to OSS note 1291142, "Web Intelligence reporting using DataDirect drivers in Unix" (as of today it still applies to BI4). The resolution is:

1. In the Bobje user's Unix profile, add/modify the following environment variables and source the profile

BOBJEDIR=<install_path>/bobje
export BOBJEDIR
ODBC_HOME=$BOBJEDIR/enterprise120/<platform>/odbc
export ODBC_HOME
ODBCINI=$BOBJEDIR/odbc.ini
export ODBCINI
LD_LIBRARY_PATH=$BOBJEDIR/enterprise120/<platform>/dataAccess/RDBMS/connectionServer:$ODBC_HOME/lib:$BOBJEDIR/enterprise120/<platform>/:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH

NOTE: For AIX replace LD_LIBRARY_PATH with LIBPATH; for HP-UX use SHLIB_PATH.
NOTE: Replace <platform> with linux_x86, solaris_sparc, aix_rs6000 or hpux_pa-risc depending on your specific platform.
NOTE: You must set/export the above environment variables in the same order as shown.

Please make sure to use the file $HOME/.odbc.ini as your default source for ODBC settings. Therefore, modify the ODBCINI variable in the following way:

ODBCINI=$HOME/.odbc.ini
export ODBCINI

2. Modify the odbc.ini to add the DSN

[TestDSN]
Driver=<install_path>/enterprise120/<platform>/odbc/lib/CRmsss23.so
Description=DataDirect 5.3 SQLServer Wire Protocol Driver
Address=<sql_server host or ip>,<port>
Database=<db_name>
QuotedId=Yes
AnsiNPW=No

NOTE: Your DSN name (TestDSN) must be the same DSN name you used when creating the ODBC connection in Windows

3. DataDirect provides both NON-OEM drivers and OEM drivers

The drivers provided with BI4 are OEM drivers, and WebI depends on the ConnectionServer, which by default is set to use NON-OEM drivers. Thus, we edited the ConnectionServer configuration to allow the use of the OEM-branded DataDirect driver. The steps are:

  • Make a backup copy of $BOBJEDIR/enterprise120/<platform>/dataAccess/RDBMS/connectionServer/odbc/odbc.sbo
  • Open odbc.sbo with vi and search for DataDirect; there are 4 entries, one for each supported MSSQL Server version.
  • Change all 4 from No to Yes <Parameter Name="Use DataDirect OEM Driver" Platform="Unix">Yes</Parameter>

 4. Stop all XI servers

Run ./stopservers, log out completely from your Unix shell and log back in (to make sure the new environment variables are picked up), then start all BI4 servers again.

After applying the OSS note we were able to retrieve data from SQL Server 2008 when refreshing our WebI documents. However, we noticed that the CPU was reaching 100% every time we used WebI in any way. Going through the log files we found errors such as "MS SQL Server 2008 |JobId:61340512 |EXIT SQLGetDiagRec with return code -1 (SQL_ERROR)".

We took another look at the odbc.ini file and found a QEWSD entry that was not initially there and had somehow appeared. Since we had copied the information from an existing data source we did not need it at all, so we removed the QEWSD=<random string> line from the ini file.

Finally, double-check that <Parameter Name="Use DataDirect OEM Driver" Platform="Unix">Yes</Parameter>, located in the sqlsrv.sbo file at /opt/bi40/sap_bobj/enterprise_xi40/dataAccess/connectionServer/odbc, is set to Yes.

We hope our experience offers you a rapid problem-solving approach. If you have any tips or suggestions to improve this article, please leave a comment below.

Managing ETL dependencies with BusinessObjects Data Services (Part 1)

Are you satisfied with the way you currently manage the dependencies in your ETL? Dependencies between jobs (or parts of jobs) are an important aspect of ETL management. They pertain to questions like: do you want to execute job B if job A failed? Imagine that you have a job C with sub-job 1 (usual runtime: 3 hours) and sub-job 2 (usual runtime: 2 minutes). If sub-job 1 was successful and sub-job 2 failed, can you gracefully restart job C without sub-job 1 being run again?

As soon as you have more than one simple job, you have to manage your dependencies. In this article (part 1 of a series about ETL dependency management) I’ll first list some of the characteristics I’m looking for in an ideal dependency management system. I will then have a look at some of the possibilities offered by SAP Data Services 4. In part 2 (my next post), I will propose the architecture of a possible dependency management system. In part 3, I will go into the details of the implementation in Data Services. I’ll finish with part 4 by telling you how the implementation went, and whether some improvements are possible.

The ideal dependency management system

In this post I will use the word “process” to designate a series of ETL operations that have a meaning together. Example: extract a source table, create a dimension, or update a fact table. The objective here is to manage the dependencies between the processes: updating a fact table should probably only be allowed if updating the corresponding dimensions was successful.

A dependency management system should ideally have at least the following characteristics:

  • Run a process only if its prerequisites ran correctly
  • After a failure, offer the option to re-run all the processes or only the processes which failed
  • Trace the outcome of each process (ran successfully, failed, did not run)
  • Run dependent processes dynamically (rather than statically, i.e. based on date/time)

The possibilities

Let’s enumerate some of the possibilities offered by Data Services, with their respective pros and cons.

1) One job with all processes inside. This is very easy to implement and dynamic in terms of run times, but it doesn’t allow for concurrent runs. Most importantly, it means that failures have to be managed so that the failure of one process does not stop the whole job.

2) One process per job, with jobs scheduled at specific times. This is very easy to implement and allows concurrent runs, but it is not dynamic enough: if process durations increase over the months/years, jobs may overlap.

3) One main job calling other jobs (for example with execution commands or Web Services).

4) One process per job, all jobs scheduled at specific times, but each job checks in a control table whether its prerequisites ran fine. If not, it sleeps for some time before checking again.

5) Use the BOE Scheduler to manage jobs based on events (how-to is well described on the SCN). I’ve not tested it yet, but I like this approach.

By default, the first two possibilities only manage the “flow” side of dependency management (after A, do B). They do not manage the conditional side (do B only if A was successful). In both cases, a control table updated by SQL scripts would allow the ETL to check whether the prerequisite processes ran correctly.
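
To make the control-table idea more concrete, below is a minimal sketch in SQL; the table and column names are illustrative, not part of any SAP template.

-- One row per process execution
CREATE TABLE ETL_PROCESS_CONTROL (
    PROCESS_NAME VARCHAR2(100),
    RUN_DATE     DATE,
    STATUS       VARCHAR2(20)  -- e.g. SUCCESS, FAILED, RUNNING
);

-- Before starting, a dependent process checks that its
-- prerequisite ran successfully today:
SELECT COUNT(*)
FROM ETL_PROCESS_CONTROL
WHERE PROCESS_NAME = 'LOAD_DIM_CUSTOMER'
  AND STATUS = 'SUCCESS'
  AND RUN_DATE >= TRUNC(SYSDATE);

If the count is zero, the dependent process either stops (solutions 1 and 2) or sleeps and checks again later (solution 4).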

What I don’t really like about solutions 2 to 5 is that it is difficult to get an overview of what’s going on; you cannot really navigate within the whole ETL easily. Solution 1 gives you this overview, but at the cost of a potentially huge job (without the possibility of processes running concurrently).

Also note that the solutions with multiple jobs will need to manage the initialization of the global variables.

What I miss in all these solutions is an optimal restart of the ETL. If 10 of my 50 processes failed and I want to restart only those 10, do I really have to start them manually?

In my next blog post I’ll propose an architecture that addresses this optimal restart.

Until then, please let me know your thoughts about how you manage your ETL dependencies. Any of the 5 solutions mentioned before? A mix? Something else? And how well does it work for you?

Use Data Services SDK libraries to construct an AWTableMetadata in a Java application

If you have a Java application that returns a table and you are planning to use it as a source of information for SAP Data Services, the best way is to return a table with the same data type as the Data Services template table “AWTableMetadata”. I will explain how to do that easily in this article.

First you need to go to the libraries folder inside your SAP BusinessObjects installation (…\SAP BusinessObjects\Data Services\lib). From this folder we have to import the following libraries into our Eclipse Java project.

  • Acta_Adapter_sdk.jar
  • Acta_broker_client.jar
  • Acta_Tool.jar

The easiest way is to put these libraries inside your Java ext libraries folder so your application will import them automatically. Also, if you are planning to deploy this application on a server, you need to place these libraries inside the server library folder too:

  • …\Java\jdk1.7.0\jre\lib\ext
  • …\Java\jre7\lib\ext
  • …\SAP BusinessObjects\Tomcat6\lib

Import these libraries inside the project:

import com.acta.metadata.AWAttribute;
import com.acta.metadata.AWColumn;
import com.acta.metadata.AWTableMetadata;

Once we have our libraries imported into our Java project, we have to declare the function in charge of constructing the table with AWTableMetadata as its return type:

public static AWTableMetadata createAWTable() throws Exception {…}

Then we are ready to construct our table. To do so we have to:

  1. Declare the table:
    1. AWTableMetadata awTable = new AWTableMetadata();
    2. awTable.setTableName("……");
  2. Assign the rows and columns:
    1. AWAttribute[] attributes = new AWAttribute[2000];
    2. AWColumn[] columns = new AWColumn[2000];
  3. Assign the attributes and columns to our table:
    1. awTable.setColumns(columns);
    2. awTable.setAttributes(attributes);

Finally, we add the return statement: return awTable;
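
Putting all the pieces together, here is a minimal sketch of the complete class. The table name and the array sizes are placeholders taken from the steps above; in a real application you would fill the attribute and column arrays with your own data.

import com.acta.metadata.AWAttribute;
import com.acta.metadata.AWColumn;
import com.acta.metadata.AWTableMetadata;

public class AWTableBuilder {

    // Builds the AWTableMetadata shell that Data Services expects
    public static AWTableMetadata createAWTable() throws Exception {
        AWTableMetadata awTable = new AWTableMetadata();
        awTable.setTableName("MY_TABLE"); // placeholder table name

        // Placeholder sizes, as in the steps above
        AWAttribute[] attributes = new AWAttribute[2000];
        AWColumn[] columns = new AWColumn[2000];

        awTable.setColumns(columns);
        awTable.setAttributes(attributes);

        return awTable;
    }
}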

In conclusion, once we have our function done we will be able to communicate and exchange data with Data Services through our application, in this case with a table, and to use our application as a data source.

If you have any doubts or suggestions, please leave a comment below.

Tuning SAP Rapid Marts XI: Streamline delta loads

You have finished your standard implementation of SAP Rapid Marts XI and everything went fine, but your customer starts to have issues with the time consumption of the delta loads. In this article I will explain a couple of approaches to achieve better performance in the delta loads of SAP Rapid Marts. The image below shows the typical infrastructure of SAP Rapid Marts, loading into one single data warehouse.

This infrastructure has pros and cons but I will highlight two main advantages:

  • Avoids duplication of information
  • Simplifies maintenance from customer perspective

 

1st Approach: One job runs it all

Taking the architecture illustrated above as our basis, the first step to achieve better performance is to create one single ETL job that runs the different SAP Rapid Marts involved in our implementation.

This task is simple: create one workflow per SAP Rapid Mart containing all the different workflows that are part of that Rapid Mart. Once this is done, create an ETL job with all the corresponding global variables, drag and drop all the workflows and connect them to create an execution sequence.

This job also allows us to take advantage of the “Execute only once” option in SAP Data Services. This option is set for all the components in SAP Rapid Marts and ensures that each component within the same ETL job execution is executed only once. If you take into account how many components are shared between different SAP Rapid Marts, this approach becomes very interesting.

In addition, this approach allows us to create a try/catch strategy in the ETL process. Some customer environments have intermittent issues that can crash the execution of our daily loads (e.g. network errors). We place try and catch blocks around every workflow of the job; inside the catch block we place the same workflow that we were trying to execute. The following image illustrates the idea:

The try/catch + “Execute only once” strategy allows you to retry the execution of a component of the ETL job and continue the execution where it stopped.

Once this idea is implemented the execution of your SAP Rapid Marts will be more robust and optimized, but maybe not enough to fulfill your customer's expectations… so let us move on to the second step.

 

2nd Approach: Working around a parallel execution

By analyzing the information in the Performance Reports generated by the SAP Data Services Management Console after the execution of a job, you will be able to identify the components with the worst execution times.

These components can vary from one implementation to another depending on your customer's environment; within the top 10 worst execution times you will find components generating information for dimensions and/or fact tables of the model. Some of these components can easily be removed from the sequential execution and placed in a separate job to be executed in parallel.

It is critical at this stage to ensure that these components are completely removed from the sequential execution and that no final output of a component is used in other parts of the ETL process (e.g. subsequent table lookups). To ensure this, the “Where is used” function of the SAP Data Services Designer is extremely helpful.

In my experience, after applying these two steps you should see a considerable improvement in the execution performance of delta loads. To give you an example, in one of our recent implementations we started with an execution time of 17 hours for five SAP Rapid Marts running sequentially; this was decreased to 6 hours using the two approaches I have described in this post.

 

Digging deeper

If even after applying the previous steps you still face bad performance in isolated components, the situation will require more analysis and customization at a lower level.

Some components of the standard SAP Rapid Marts try to execute complex logic on the ERP side, which can take a very long time (e.g. SAP General Ledger RM + SAP Note 1557975, or SAP Inventory RM + SAP Note 1528553).

In these cases, the workaround is to split the process into several steps and perhaps make use of custom tables on the ERP side; the performance boost will be remarkable. In our most recent implementation, one of the components took no less than 12 hours to run; after we analyzed and modified its behavior to use one custom table on the ERP, it took no more than 30 minutes. This customization took 2 man-days to implement completely.

In conclusion, my experience with SAP Rapid Marts is very positive. SAP provides a rapid deployment solution that can be up and running end-to-end in a few weeks. Furthermore, it provides an extremely easy-to-use framework that lets your customer develop any level of customization in a few weeks. Overall, we are looking at a solution that allows your customers to create their own data warehouse in weeks instead of months. If we can improve delta load performance, the solution becomes even more appealing to your customers and helps increase satisfaction levels with the tool.

That's all folks! I hope this article helps you raise the bar in your SAP Rapid Marts implementations. If you have any doubts, feel free to leave a comment below.

Problems Uninstalling Data Services

I faced a problem recently and I wanted to share the resolution, in case you have to deal with the same topic. I was trying to upgrade a Data Services machine following the SAP procedure (that is, copying the configuration files, uninstalling and then installing the new version; not very sophisticated, as you can see). This wasn't as simple as I first thought.

The problem started after uninstalling the software: the new version refused to install, stating that I should first uninstall the previous version. I uninstalled the software again… but Data Services was still there, so I uninstalled again, but this time the process failed (which makes sense, as the software was already uninstalled), so I kept trying… reboot… uninstall… reboot… rename the old path name… reboot… you see where this is going…

So, how did I finally solve this?

  1. Start the Registry Editor (type regedit in a command window or in the Run dialog).
  2. Take a backup of the current Registry content. To do this, with the top node of the registry (Computer) selected, go to File -> Export and select a name for the backup file.
  3. Delete the key HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\EIM (Suite XX.X may vary). NOTE: You may want to write down the key HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\EIM\Keycode first, as it contains the license code.
  4. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall and look for a key whose DisplayName property is "BusinessObjects Data Services", then delete it. This step removes the entry for the software from the Windows Uninstall dialog.
  5. Finally, delete the content of the installation directory (typically C:\Program Files\Business Objects\Business Objects Data Services).

Now you can launch the installer and this time it should work.

I hope this helps in case you are experiencing the same issue. Leave a comment below if you have any doubts or if you would like to add anything.

 

Tip of the day - Table and index size in Oracle

Ever wanted to find out how big the tables in your data warehouse or in your ETL storage area are? Here is a quick tip.

You can get the size of each table belonging to a specific user with the following code:

select sum(bytes)/1048576 Size_MB, segment_name Table_name
from user_extents
where segment_name in (
     select table_name from all_tables
     where owner = 'OWNER_NAME_HERE')
group by segment_name
order by 1 desc;

In order to get the size of the indexes with the corresponding table names (useful when the indexes have system-generated names), we need another query:

select sum(u.bytes)/1048576 Size_MB, u.segment_name index_name, i.table_name
from user_extents u
join all_ind_columns i
     on u.segment_name = i.index_name
     and i.column_position = 1
where i.index_owner = 'OWNER_NAME_HERE'
group by u.segment_name, i.table_name
order by 1 desc;

If you have any doubts or suggestions, leave a comment below.