""

SAP BusinessObjects

Installing Data Services 4.0 in a Distributed Environment

Following my first experiences with the SAP BusinessObjects 4.0 platform, I decided to write this article after spending three days figuring out how to install Data Services 4.0 in a distributed architecture. Why did it take so long? There is something different in the new Data Services: SAP has tried to unify the security and control of the servers through the CMC. With this new feature, Data Services users and services can be managed from the CMC instead of the Data Services Management Console. This has brought slight changes to the installation process of Data Services 4.0 compared with previous releases.

I was working with a distributed architecture, which means that I was planning to install the SAP BusinessObjects platform (including Live Office, Dashboard Designer 4.0, Client Tools and Explorer) on one server, let's say "ServerA", and Data Services 4.0 on a separate server named "ServerB".

If you have installed older versions of BusinessObjects such as XI R2, XI 3.0 or XI 3.1, you know the drill: we would install the BusinessObjects platform on ServerA and Data Services on ServerB, and then we would find that Data Services was not integrated with the rest of the platform, probably seeing in the CMC the same error shown in the image below.

As you can see, this image comes from an SAP note that explains how to solve this problem on past releases. However, if you installed Data Services 4.0 in a distributed architecture, the error won't be solved by the solution described in SAP Note 1615646.

Once the scenario is clear, let's start with the process to install Data Services in a distributed architecture. Before starting, ensure that the BusinessObjects 4.0 platform and the client tools, with their latest service packs and patches, are installed on ServerA.

 

Step 1: Install Data Services 4.0 in ServerA

As with any new software installation, the first step is to uncompress (if needed) the file downloaded from the SAP Service Marketplace on ServerA. Open the root folder, go to data_unit and run setup.exe. After that, choose the language to use during the installation.

After these “typical” steps the installation program checks for required components. Review the results and decide whether to continue with the installation, or abort and correct any unmet requirements.

In the next three steps you have to review the SAP recommendations, accept the license agreement and, finally, fill in the license key code, name and company. Once the license key is verified you can choose the language packs you wish to install.

Then it is time to choose the installation folder, but in this case the wizard doesn't let you choose: because the BO platform was installed before, it takes the BO installation folder as the default.

After that you have to configure the CMS information. In this case our ServerA is going to be named "MVBOBITEST01".

This is the important step! The next screen invites you to select what you want to install. On ServerA you have to install all the Data Services features you need apart from Designer, Job Server and Access Server. See the image below.

The subsequent screen will ask about merging with an existing configuration. In this case no existing configuration can be reused, so the answer is: skip configuration.

Then you can choose whether to use an existing database server or skip this part and do it after the installation. Choose what suits you best. Imagine that you or your vendor don't have the database or schemas for the CMS repositories ready when you are planning to install Data Services. In that case, you can configure the CMS parameters after the installation without problems.

After filling in the information for the CMS system database and the Audit database (if required), the Metadata Integrator configuration starts. I am not going to describe this step in depth because it has no impact on our installation. Furthermore, configuring Metadata Integrator is not difficult; as always, you only have to choose the ports, folders and the name.

Once we have finished configuring Metadata Integrator and ViewData, the installation will start. After the installation process we can proceed to the next step.

Step 2: Install Data Services 4.0 in ServerB

After a few hours installing the first part of Data Services, we are ready to install Data Services on a dedicated server which is going to host the ETL only.

Again, uncompress the file you downloaded before, insert the DVD, or prepare whatever installation media you are using, and run setup.exe.

Once the installation starts, we are going to repeat the same first steps mentioned before until we reach the screen to specify the CMS information. Add the CMS information for ServerA, or "MVBOBITEST01". Why? Because we don't have a CMS installed on ServerB.

Choose the components that we did not choose before during the BusinessObjects server installation: Job Server, Access Server and Designer.

As in the ServerA installation, choose to skip the configuration if no previous configuration exists.

In the next step you have to configure the account under which you would like to run Data Services. You can choose to run it using a system account or with a specific service account defined previously. What is good about using a service account rather than a system account is that if you want to stop a service (like the Data Services Job Server) you need the password of that account, because it is not tied to the system account with which you log on to the OS.

Then configure the Access Server. In this case I kept the default values.

After the last step the wizard asks to start the installation. After a couple of hours you will have Data Services 4.0 running.

There is another important point after the installation process that I will be covering in my next article, due at the beginning of May: the repository configuration. One of the best new features of Data Services 4.0 is the integration with the BO platform through the CMC, which results in complete integration when you configure the repositories properly.

If you have any questions or other tips, share them with us by leaving a comment below.

How to convert a Universe to Multi-source in SAP BusinessObjects BI4

The new BI 4 offers a very powerful meta-layer capability: a single Universe can sit on top of several data sources, with the great benefit of real-time integration. At first glance, you might think that existing Universes would need to be rebuilt from scratch, but this article explains how to re-use an existing Universe to provide this highly scalable and expandable meta-layer.

The multi-source Universe

A multi-source Universe is now designed as a project with the following components:

  • Connections
  • Data Foundation
  • Business Layer

These items can be created and configured separately, and then connected to one another. The cycle of creating a new Universe is easy because the connections, data foundation and business layer can be created intuitively, always using common SQL, so there is no need to know the peculiarities of each native connection. Once built, what happens behind the scenes is transparent to the end user, who will see that BusinessObjects produces a query taking pieces of information from different sources in real time.

However, while the creation process is quite simple when generating a new Universe from scratch, it is not so straightforward if we are migrating from a legacy universe. Let’s see why.

UNV to UNX conversion process

In our experience, the three steps to be completed are the following:

  • Legacy Universe (UNV) import: Using the standard migration process the legacy Universe can be inserted into the new BI4 platform. This can be done in a very short time and it has the following quick advantages:
    • Migrated Web Intelligence reports will still sit on top of this legacy meta-layer.
    • Live Office BI4, Crystal 2011 and other client tools can continue to perform as these are still using this format.

But we still cannot use platform modules like Explorer BI4 or Crystal Enterprise, or benefit from the new security model or the new features of the Information Design platform, so the natural next step is to enable this.

  • New Universe (UNX) conversion: From the Information Design Tool we click on File, "Convert .unv universe", and a new UNX universe is produced, with a project containing the three main items: Connection, Data Foundation and Business Layer. The advantages are the ones previously stated, but there is one big disadvantage: the automatically generated Data Foundation is mono-source, so the resulting Universe will not be scalable, and there is no easy way of turning a Data Foundation from mono- to multi-source. It therefore needs to be rebuilt. The process for rebuilding the Universe is explained in the following step.
  • New Universe (UNX) multi-source conversion:

A new Data Foundation shall be created, following the steps stated below:

  • Define connections
  • Create new Data Foundation
  • Copy and paste items to the new Data Foundation and/or re-type tables and joins using the standard SQL language.

The Business Layer also needs changes, basically to be re-pointed to the new Data Foundation. The recommended steps are:

  • Re-point the Business layer to the new Data Foundation
  • The calls from the objects to the tables will need to be re-typed using the standard SQL language

A limitation at this stage is that the useful "View Associated Table" feature, which showed the table lineage for a given object, has disappeared, so this might become quite manual work. Opening the Universe Design Tool in parallel with the Information Design Tool to get the lineage might help here.

Once this is done, verify and export this new universe.

As a final step, the WebI reports can now be re-pointed to the new multi-source UNX so they can be enhanced with new alternative data.

Process summary

The following diagram summarizes the process:

  • Step 1: Legacy Universe import
  • Step 2: New Universe UNX conversion
  • Step 3: New Universe UNX multi-source conversion

UNV to UNX conversion process summary


Conclusion

In the short term, it should become common practice in BI4 to have three versions of the same universe:

  • UNV: To preserve the legacy WebI reports and to use certain client tools like Crystal 2011 or Live Office.
  • UNX mono-source: To use certain platform tools like Explorer or Crystal Enterprise and to have a higher level functionality.
  • UNX multi-source: To use certain platform tools like Explorer or Crystal Enterprise, have a higher level functionality and be able to use several sources in one Universe.

In the mid-term, only this last, multi-source version should remain.

Benefits

This Universe conversion method is time-efficient, as it reuses all existing folders and objects, and the steps above provide tips for a better Universe re-creation.

The multi-source Universe gives superior benefits to the end user, providing real-time integration and report design simplicity which will make their lives easier. It also helps the meta-layer designers, who will see their development time reduced with the help of the new design panel functionalities and a common standard language which is easier to understand. Project managers and architects can also consider the fact that they do not have to build a full Data Warehouse for their projects, and with all this, IT managers will see a quick ROI and lower TCO for their investments.

If you have questions about this method or about the new Information Design Tool in SAP BI4, or if you want to share your experience or tips, please feel free to leave a comment!

How to deploy SAP BusinessObjects 3.1 Web Applications with IBM Websphere

As we all know, Tomcat and IIS are the most commonly used tools to deploy the web applications (e.g. InfoView, CMC, ...) in SAP BusinessObjects, and this deployment can be done automatically through the BusinessObjects server installation. However, SAP BO also allows you to perform this deployment with other application servers. In this article I will explain how to deploy SAP BusinessObjects web applications using IBM WebSphere.

First of all, we have to agree that any web deployment apart from those done with Tomcat and IIS must be done manually.

  • Supported application server

SAP BusinessObjects 3.1 is supported on IBM WebSphere 6 Express Edition or 6 ND Edition.

  • Installation
    • Make sure that IBM WebSphere has been installed successfully on the machine and that all the services are up and running.
    • During the SAP BusinessObjects server installation, when you reach the web deployment part, DO NOT SELECT any of the options to deploy Tomcat or IIS; just check the box to deploy the web applications manually later.
  • Web configuration file
    • The wdeploy configuration file is:

<BO_install_dir>\deployment\config.websphere6

    • Modify the config.websphere6 file; the properties you typically need to adjust are shown below.

config.websphere6 file:

# as_dir: the installation directory of the application server
as_dir=C:\Program Files\IBM\WebSphere\AppServer

# as_instance: the application server instance to deploy to
as_instance=server1

# as_virtual_host: the virtual host the applications will be bound to
as_virtual_host=default_host

# as_soap_port: the SOAP administration port of the administration server.
#   If the value is not set (if the line is commented out), the default value is used.
as_soap_port=8880

# as_admin_is_secure (default: false): is security activated in WebSphere?
#   Security is activated when a user wishing to log into the admin portal has to provide
#   a username and a password. When security is NOT activated, it is not necessary to
#   provide as_admin_username and as_admin_password (the lines can be commented out)
as_admin_is_secure=false
as_admin_username=admin
#as_admin_password=%AS_ADMIN_PASSWORD%

# ws_instance: the web server instance that will serve the requests, in distributed mode
#ws_instance=webserver1 (TO BE USED IF the web server is installed in SPLIT mode)

## Don't remove next line
enforce_file_limit=true

 

  • Command used to deploy the applications

To deploy the web applications, open a command prompt (CMD) on the BO server and run the following command:

wdeploy config.websphere6 deployall

This will deploy all the BO web applications onto the IBM WebSphere server. The process takes about 20 minutes, and 17 applications are installed.
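If you later need to redeploy just a single application (for example after a patch), wdeploy also supports per-application actions. As a sketch, assuming the wdeploy syntax from the SAP BusinessObjects Web Application Deployment Guide (verify the exact parameters against your version), the following would deploy only InfoViewApp:

wdeploy config.websphere6 -DAPP=InfoViewApp deploy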

  • Deploying web applications with the WebSphere administration console

Ensure that your WebSphere web application server is installed, configured and running before deploying WAR files.

  1. Log in to the WebSphere Application Server Administrative console using the following URL: http://WAS_HOSTNAME:PORT/admin (the WebSphere admin console's default port number is 9060). Give a unique name to your web application and proceed to step 2.
  2. Under the Applications heading of the console navigation menu, click Enterprise Applications on the left navigation pane. Highlight the server you created (or server1 if you didn't create your own) from the Clusters and Servers list and enable the Select checkbox. Proceed to step 3.
  3. Click the Install button and navigate to the location of the WAR file to deploy. If deploying from a remote file system, select the "Remote File System" option. Select the virtual host you created (or default_host if you didn't create your own) from the Virtual Host drop-down list. Proceed to step 4.
  4. Enter a context root for the WAR file (e.g. /CmcApp for CmcApp.war) and press the Next button, followed by Continue.
  5. Review the summary page, and press Finish when done.
  6. Click Save to Master Configuration.
  7. Click the Save link, then the Save button.
  8. Under the Applications heading of the console navigation menu, click Enterprise Applications on the left navigational pane.
  9. Verify that the WAR file was deployed, and then click the Start button. Repeat these steps for each WAR file to deploy.
  • Test

To test your deployment, open a browser and enter the URL (e.g. for InfoView):

http://<BOservername>:<PortNumber>/InfoViewApp

 

If you have any questions or contributions, please leave a comment below.

Attend the Clariba Webinar "Why Migrate to SAP BusinessObjects BI 4?"

Do you wish to know more about the reasons to migrate to SAP BusinessObjects BI 4, the most advanced Business Intelligence platform?

Attend our Webinar on the 12th of April, from 11:00-12:00 CET (Presented in Spanish)

REGISTER HERE

SAP BusinessObjects BI 4 offers a complete set of functionalities that are key in today's Business Intelligence market: improved performance management, reporting, search, analysis, data exploration and integration. This new version of SAP's BI platform introduces several significant improvements to your BI environment, with a great number of functionalities designed to optimize performance.

With this in mind, Clariba invites you to invest an hour of your time to get to know the new features and advantages of SAP BusinessObjects BI4, the most advanced BI platform.

The agenda of our webinar is the following:

  • Welcoming and introduction
  • What is new in SAP BusinessObjects BI 4
  • Benefits of migrating to SAP BusinessObjects BI 4
  • Why migrate with Clariba
  • Questions and answers

For more information about SAP BusinessObjects BI 4, visit our website.

Best Regards,

Lorena Laborda Business Development Manager - Clariba

 


Applying Custom Security on Web Intelligence Documents

One of the main hurdles that a Web Intelligence developer has to overcome is how to deal with data security. Indeed, data security remains an overriding concern for many companies trying to ensure the availability, integrity and confidentiality of business information, protecting the database both from destructive forces and from unwanted actions or undesired data visualization by unauthorized users. In SAP BusinessObjects we have several ways to set up a security roadmap in terms of data access authorization, but this time I would like to show how to apply custom security to our WebI documents, using a simple table, forced joins in Universe Designer and the WebI tool, in order to show only the data that a user is authorized to see.

We have the following scenario: imagine a group with different levels of hierarchy in terms of data access. The higher you are in the organization, the more data you have access to. The first level of the hierarchy can see all the data; the second level can see its own level's data and the levels below, but won't have access to first-level information; the third level can see its own level's data and the levels below, but won't have access to second- and first-level information, and so on.

 

Let's now see a step-by-step approach on how to achieve this.

The first thing to do is define the hierarchy structure, specifying each individual's level and the data he or she will therefore have access to. After that, we have to create a table in our database where we will store groups, users and privileges. The key fields for this purpose are:

BO_User: this will be checked against the current user who accesses the WebI document.

Highest_Level: the level in the hierarchy the user belongs to. For this example we will have four levels of organization, where 0 is the highest level and 3 is the lowest.

Level_value: this will be checked against the fact table.
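To make this concrete, below is a minimal sketch of the security table; the DDL and data types are hypothetical, simply following the fields described above and the derived table used later in this article:

-- Hypothetical DDL for the security table (names follow the fields above)
CREATE TABLE CLARIBA.SECURITY (
    security_group  VARCHAR(30),   -- security group the rule belongs to (e.g. 'CLARIBA')
    BO_User         VARCHAR(30),   -- BusinessObjects login, checked against CurrentUser()
    Highest_Level   INTEGER,       -- 0 (highest level) to 3 (lowest level)
    level_value     VARCHAR(30)    -- organizational value checked against the fact table
);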

Once we have the table with all the related data stored, it is time to map it in the SAP BusinessObjects meta layer. For this purpose we have to import the affected universe and create a derived table which will retrieve all the related data for a given user (this means all the data that a user is able to see according to his data access level). The SQL code should be something like this:

SEL BO_User, Level_Organization_3
FROM CLARIBA.SECURITY a
LEFT JOIN
  (SEL Level_Organization_0, Level_Organization_1, Level_Organization_2, Level_Organization_3
   FROM CLARIBA.FACT_TABLE GROUP BY 1,2,3,4) b
ON (   (Highest_Level=0 AND UPPER(a.level_value) = b.Level_Organization_0)
    OR (Highest_Level=1 AND UPPER(a.level_value) = b.Level_Organization_1)
    OR (Highest_Level=2 AND UPPER(a.level_value) = b.Level_Organization_2)
    OR (Highest_Level=3 AND UPPER(a.level_value) = b.Level_Organization_3) )
WHERE security_group='CLARIBA'

 

This particular derived table provides a couple of objects which will be used in the WebI document that we want to secure (BO_User and Level_Organization_3).

The third step is to develop and apply the security in the WebI document where we want to carry out the data restriction. For this purpose we have to create two new dimensions and one detail. Ensure that your query includes the newly created objects.

The first task is to discover which user is trying to access the WebI document. We can get their login by creating a new dimension named "BO_User" that contains the following formula:

 =CurrentUser()

 

Once we know who is trying to access the WebI document, we have to check whether BO_User matches the user name that we have in our table. We can create a dimension named "FlagBOuser" with the following formula:

=If(Lower([BO_User])=Lower([User Name]);1;0)

 

The next step is to control what level of data access this BO_User will have; in other words, we are applying a kind of row/column-level security. For this purpose we create a detail object named "Level_Organization" with the following code:

=If([FlagBOUser]=1;[Level_Organization_3])

 

Once we have these objects, the very last step is to drag and drop both FlagBOuser and Level_Organization as global filters at the document level. This way we apply the data restriction to every data block displayed in the report.

The conditions to be applied are simple: "FlagBOuser" must be equal to 1, meaning that the given user corresponds to a user in the database table, and "Level_Organization" must not be null, meaning that there is data to be displayed.

At this point of the exercise, we should be able to restrict the data contents displayed in the WebI document according to the user who wants to access it.

Last but not least, we can also control particular cells, such as subtotals, by creating a flag that ensures only the employees that are allowed to see this content are able to do so:

=If(Lower([BO_User]) InList("SilviaR";"JoseY");1;0)

 

As we have seen in this example, custom security in WebI provides an alternative to other types of security that we can apply in our BO system (such as row-level security in Universe Designer). We can achieve a very decent data security solution with simplicity, effectiveness and reduced maintenance requirements.

If you have any questions do not hesitate to leave a comment below.

Socializing your success - Interview with Marc Haberland

As part of the SAP Best Performance Challenge 2012, we have conducted an interview with Marc Haberland, managing director of Clariba, about the challenges our company is facing, how we are dealing with them and how this has brought us success.

  • What is the greatest challenge you see in the market today and how is your firm dealing with it?

Naturally, the state of the global economy has also had an impact on our company. Along with increasing competition and pressure on consultancy rates due to the shrinking number of projects, we are mainly faced with delays to projects that were already approved, and with the resulting inability to plan accurately. This has required us to increase our internal reporting and hold more frequent resource planning meetings, combined with tighter financial control.

On the positive side, the challenges in the market have also required our customers to take a closer look at optimizing their business intelligence systems and related processes while choosing the best value for money. As a result, we have been very successful in helping companies optimize their BI investments and improve resource allocation by helping them deploy BI competency centers as opposed to decentralized BI. Focused solutions such as our 360 BI Assessment, our prepackaged solutions that help customers achieve faster ROI, as well as our focus on certifications, training and excellence have helped us build a pool of the best BI consultants in the market – a key to survival and continued growth.

  • What processes do you use for your business planning and for adapting your plan to current market conditions?

Like many companies, we started out managing our business with an Excel sheet. Given the need for better transparency, control and reporting, over the past 18 months Clariba has made a major investment in a cloud-based ERP & CRM solution. This solution, which touches every single aspect of our company, now provides us with the information and processes we need to be successful. We have also put a major effort into the BI aspects to ensure that we have the visibility we need to take rapid decisions. In fact, we have several customers interested in replicating some of our internal reporting solutions, such as our BI Project Management dashboard, which includes earned value analysis and more.

  • How do you perceive the marketing plan using interactive online tools such as social media, social networks, blogs, or other digital media?

Social media in our business, where we deal with B2B interaction, is still in its early stages. Yet behind every company we work with or target, we find a group of amazing people who want to interact, feel taken care of and build a long-term trust relationship for the benefit of the company they work for. For this reason, we have embraced social media since about 2008 with blog articles, tweets, a LinkedIn profile and, lately, our own YouTube channel. For a consulting company we are very active in social media. In fact, just recently the SAP social media team invited Clariba to speak about the success that we have had with social media and the best practices we recommend to other SAP partners.

  • How does the Best Performance Challenge help your firm and its employees?

I believe the Best Performance Challenge is an excellent initiative as it has brought together a multidisciplinary team from Clariba to compete in a fun and engaging way. Not a week passes without our team eyeing the current results and where we stand! But not only is it a fun experience, it has forced many different people in the organization to stop their day-to-day activity and focus on a specific question, to learn and to provide new impulses that will ultimately serve Clariba and our relationship with SAP. We need to evolve and we need to continue learning. The Best Performance Challenge has enabled us to do just that!

Implementing Materialized Views in Oracle - Execute queries faster

Let's assume that you've been convinced by Marc's excellent article about the aggregate awareness dilemma, and that after weighing all the arguments you've decided to implement the aggregates in your Oracle database. Two parts are necessary: the materialized views and the query rewrite mechanism.

What is a materialized view?

Think of it as a standard view: it is also based on a SELECT query. But while views are purely logical structures, materialized views are physically created, like tables. And like tables, you can create indexes on them. Materialized views can also be refreshed (automatically or manually, as we'll see later) against their definitions.

Let's imagine the following situation: a multinational company manages the financial accounts of its subsidiaries. For each period (year + month) and for each company, many thousands of records are saved in the data warehouse (with an account code and an MTD (month-to-date) value). You'll find below a very simplified schema of this data warehouse.

What happens when we want to have the sum of all accounts for each period?

Without a materialized view, all the rows have to be retrieved so that the sum can be calculated. In my case, the following query takes around 2 seconds on my test database. The explain plan tells me that more than 1 million records had to be read in the first place.

(Query 1)

select p.year, p.month, sum(a.mtd)
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month

So how do you avoid reading more than 1 million records? One solution is to maintain aggregate tables in your database, but that means a bigger ETL and a more complex Universe with @aggregate_aware functions. Although this could be a valid option, we've chosen to avoid it.

Another solution is to create a materialized view. The syntax can be quite simple:

(Query MV-1)

CREATE MATERIALIZED VIEW MV_PERIODS
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select p.year, p.month, sum(a.mtd) as sum_mtd
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month

Let's go through the query lines.

  • CREATE MATERIALIZED VIEW MV_PERIODS => We simply create the view and give it the name MV_PERIODS.
  • BUILD IMMEDIATE => The materialized view will be built immediately.
  • ENABLE QUERY REWRITE => If we don't specify this, then the materialized view will be created and could be accessed directly, but it wouldn't be automatically used by the query rewriting mechanism.
  • The "as select…" is the same as the original query we made.

You'll notice when executing this query that the time needed to create this materialized view is at least the time needed to execute the sub-query (plus some time needed to physically write the rows in the database). In my case it was 2.5 seconds, slightly more than the original 2 seconds.

If I now re-execute my original query, I get the same result set as before, but instead of 2 seconds it now takes 16 milliseconds: 120 times faster! Oracle understood that it could automatically retrieve the results from the materialized view, so it only read this small table instead of doing a full read of the fact table.
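If you want to verify the rewrite yourself, one quick check (a sketch using standard Oracle tooling) is to display the execution plan and confirm that MV_PERIODS appears in it:

EXPLAIN PLAN FOR
select p.year, p.month, sum(a.mtd)
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
-- The plan should access MV_PERIODS (an operation such as MAT_VIEW REWRITE ACCESS FULL)
-- instead of performing a full read of ACCOUNT_BALANCE.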

 

The data freshness

Now imagine a new month has gone by, and new rows have arrived in your data warehouse. You re-execute your original select query and, to your great surprise, it takes a long time again: 2 seconds! But why?

It is possible to ask Oracle to tell us whether a query was rewritten with a given materialized view and, if not, to give us the reasons. A possible syntax is shown below.

SET SERVEROUTPUT ON;
DECLARE
  Rewrite_Array SYS.RewriteArrayType := SYS.RewriteArrayType();
  querytxt VARCHAR2(4000) := '
    select p.year, p.month, sum(a.mtd)
    from dim_period p, account_balance a
    where a.period_key = p.period_key
    group by p.year, p.month
  ';
  no_of_msgs NUMBER;
  i NUMBER;
BEGIN
  dbms_mview.Explain_Rewrite(querytxt, 'MV_PERIODS', Rewrite_Array);
  no_of_msgs := rewrite_array.count;
  FOR i IN 1..no_of_msgs
  LOOP
    DBMS_OUTPUT.PUT_LINE('>> MV_NAME  : ' || Rewrite_Array(i).mv_name);
    DBMS_OUTPUT.PUT_LINE('>> MESSAGE  : ' || Rewrite_Array(i).message);
  END LOOP;
END;

(You can update the query text and the materialized view name; the rest should stay as is.)

Once I executed these lines, I got the following result:

>> MV_NAME  : MV_PERIODS
>> MESSAGE  : QSM-01150: query did not rewrite
>> MV_NAME  : MV_PERIODS
>> MESSAGE  : QSM-01029: materialized view, MV_PERIODS, is stale in ENFORCED integrity mode

(Technical note: to see these lines in Oracle SQL Developer, you need to activate the DBMS output: menu View / DBMS Output, then click on the 'Enable DBMS Output for the connection' button.)

The line "materialized view, MV_PERIODS, is stale in ENFORCED integrity mode" means that the materialized view is not used because it does not have the right data anymore. So to be able to use the query rewrite process once again, we need to refresh the view with the following syntax:

BEGIN DBMS_SNAPSHOT.REFRESH('MV_PERIODS','C'); END;  -- 'C' requests a complete refresh
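If you would rather have the refresh happen automatically, one option (a sketch assuming a nightly batch window; adjust the schedule to your own loads) is to declare the refresh schedule in the view definition itself:

CREATE MATERIALIZED VIEW MV_PERIODS
BUILD IMMEDIATE
REFRESH COMPLETE START WITH SYSDATE NEXT TRUNC(SYSDATE) + 1 + 2/24  -- every night at 02:00
ENABLE QUERY REWRITE
AS
select p.year, p.month, sum(a.mtd) as sum_mtd
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month;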

Note that in certain situations the final users may prefer having yesterday's data in 1 second rather than today's data in 5 minutes. In that case, choose the STALE_TOLERATED integrity mode (rather than the ENFORCED default) and the query will be rewritten even if the data in the materialized view is no longer fresh.
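The integrity mode is controlled by the QUERY_REWRITE_INTEGRITY parameter. As a quick sketch, at session level:

ALTER SESSION SET QUERY_REWRITE_INTEGRITY = STALE_TOLERATED;
-- (it can also be set instance-wide with ALTER SYSTEM; the default is ENFORCED)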

 

Extend your materialized views

Now let's imagine that we want the account sums not only by period, but also by company code. Our new SQL query is the following:

(Query 2)

select p.year, p.month, c.company_code, sum(a.mtd)
from dim_period p, account_balance a, dim_company c
where a.period_key = p.period_key
and a.company_key = c.company_key
group by p.year, p.month, c.company_code

Of course the materialized view MV_PERIODS doesn't have the necessary information (company key or company code) and cannot be used to rewrite this query. So let's create another materialized view.

(Query MV-3)

CREATE MATERIALIZED VIEW MV_PERIODS_COMPANIES
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select p.year, p.month, c.company_code, sum(a.mtd) as sum_mtd
from dim_period p, account_balance a, dim_company c
where a.period_key = p.period_key
and a.company_key = c.company_key
group by p.year, p.month, c.company_code

So now our query takes a very short time to complete. But what if, after having deleted the MV_PERIODS materialized view, you try to execute the first query (the one without the companies)? The query rewrite mechanism will work as well! Oracle will understand that it can use the content of MV_PERIODS_COMPANIES to calculate the sums quicker.

Be aware that the query will only be rewritten if you have created a foreign key relationship between ACCOUNT_BALANCE.COMPANY_KEY and DIM_COMPANY.COMPANY_KEY. Otherwise you'll get the following message:

QSM-01284: materialized view MV_PERIODS_COMPANIES has an anchor table DIM_COMPANY not found in query.
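For reference, here is a sketch of such a foreign key (assuming DIM_COMPANY.COMPANY_KEY is already the primary key; the names follow the schema used in this article):

ALTER TABLE account_balance
  ADD CONSTRAINT fk_account_balance_company
  FOREIGN KEY (company_key) REFERENCES dim_company (company_key);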

 

Is basing the materialized view on the keys an option?

The materialized views we've created are very interesting but still a bit static. You may ask yourself: wouldn't it have been a better idea to base the materialized view on the keys? For example, with the following syntax:

(Query MV-4)

CREATE MATERIALIZED VIEW MV_PERIODS_COMPANIES_keys
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select period_key, company_key, sum(mtd) as sum_mtd
from account_balance
group by period_key, company_key

The answer is "it depends". On the good side, this allows for a greater flexibility, as you're not limited to some fields only (as in the query MV-1 where you're limited to year and month). On the bad side, as you're not using any join, the joins will have to be made during the run-time, which has an impact on the performance query (but even then, the query time will be much better than without materialized views).

So if you want a flexible solution because you don't know yet which fields the users will need, it's probably better to use the keys. But if you already know the precise queries that will come (for example, for pre-defined reports), it may be worth using the needed fields in the definition of the materialized view rather than the keys.

If you have any doubts or further thoughts on this topic, please leave a comment below.

Attend the Clariba Webinar "Why Migrate to SAP BusinessObjects BI 4?"

Do you wish to know more about the reasons to migrate to SAP BusinessObjects BI 4, the most advanced Business Intelligence platform?

Attend our Webinar on the 13th of March, from 11:00-12:00 CET (Presented in Spanish)

REGISTER HERE

SAP BusinessObjects BI 4 offers a complete set of functionalities that are key in today's Business Intelligence market: improved performance management, reporting, search, analysis, data exploration and integration. This new version of SAP's BI platform introduces several significant improvements to your BI environment.

With this in mind, Clariba invites you to invest an hour of your time to get to know the new features and advantages of SAP BusinessObjects BI4; the most advanced BI solution will provide your company with a great number of functionalities designed to optimize performance and bring you a scalable and secure platform.

The agenda of our webinar is the following:

  • Welcoming and introduction
  • What is new in SAP BusinessObjects BI 4
  • Benefits of migrating to SAP BusinessObjects BI 4
  • Why migrate with Clariba
  • Questions and answers

For more information about SAP BusinessObjects BI 4, visit our website www.clariba.com

Best Regards,

Lorena Laborda Business Development Manager - Clariba

 


Incident Management with SAP BusinessObjects exam: a suggested path of study

Whether you want to expand your personal curriculum or your goal is to become a consultant for your company's Support Center for SAP BusinessObjects, passing the Incident Management with SAP BusinessObjects exam (booking code C_BOSUP_90) is a key step toward becoming a certified consultant in this area.

If you have ever tried to obtain an SAP certification before, you will probably be familiar with the feeling of not knowing where to start, or which strategy to take in order to study for the exam as efficiently as possible. It becomes a challenge to combine your study time with your normal day-to-day work. Therefore, what I will share here is a suggested path of study, based on my own experience, for the Incident Management with SAP BusinessObjects exam, required to become a certified SAP BusinessObjects Support Consultant.

What you must do before taking the exam

In order to become a certified SAP BusinessObjects Support Consultant, you must first pass the Web Assessment Tests for several of BusinessObjects' key areas, such as Web Intelligence, Universe Designer and BI Root Cause Analysis, among others. The materials for these tests are listed as Required in the Learning Plans that can be found in the SAP Channel Partner Portal under Education / SAP BusinessObjects / Role-Based Training / Support Consultants. They are free of charge and relatively easy to undertake.

 What is this exam about?

So, first of all, we need to know the details and structure of the exam (which can also be found at the following link on the SAP Training site: C_BOSUP_90 Booking Details). Basically, it consists of 80 questions to be answered in 180 minutes. The questions are focused on proving that "the candidate has a good overall understanding within this support consultant profile, and can apply this knowledge practically in the handling of client messages under guidance of an experienced support consultant".

Also, you must be aware that this test is closely related to the SAP Solution Manager tool, so it is very advisable to have this product installed in your company, or at least to find a way to gain some hands-on experience with it.

 

In what areas should you focus?

As with all exams, the best suggestion is to read all the required learning material at least once. That said, I find it more useful at this point to highlight the topics, found across most of the documents, on which you should focus in more depth.

  • Message Solving, Problem Analysis and Providing Solutions to Customer:
    • One of the key documents that you will find in the learning material is L1220 – Message Solving. Here you should pay attention to the technical terms and SAP definitions explained, since several questions are based on this part.
    • Even more important is the L1225 – Efficient Message Solving document: you must carefully read each typical situation given and the types of answers, the information gathered and the interactions that are recommended with a customer in each situation.
  • Message Processing: This section should not be confused with the Message Solving topic as they refer to different parts of the Support process.
    • The L1260 – Message Processing document is the source of several exam questions, so be sure to understand the different workflows in and out of Partner working hours, the different message statuses and how to access SAP Notes. You will also have some questions covered if you memorize the different SAP transactions that are explained, and if you learn how to gather customer information before sending it to the SAP Support Backbone when necessary.
    • On the other hand, the L1270 – Message Processing via Work Center and L1275 – How to Create a Message documents contain a lot of information regarding the basic workflows of SAP Solution Manager and its Work Center. It is advisable to have a correct understanding of all of them, since some questions are related to this.
  • VAR Service Desk:
    • In the L0120 – VAR Support Enablement Program document you should understand the key vocabulary explained, as well as the tasks and responsibilities of employees in a VAR Support Team.
    • Also, L0125 – Incident Management is a very important document, where understanding the three task levels, the message flow and how High Priority messages are attended to is absolutely key.
    • Finally, in L0155 – Mission Critical, special attention must be given to the Service Level Agreements (SLAs), in order to master the terms under which you will be operating with SAP as a Support Consultant.
  • Using SAP Enterprise Support: In the Providing Solutions section of the exam's learning plan you will find a document that comes without a code, called SAP Service and Support. It basically covers how partners benefit from SAP Service and Support, so you should at least give it a good read to secure two or three questions on the exam.
  • Basic Understanding of SAP Solution Manager: This section complements Message Processing.
    • L2220 – EarlyWatch, Service Level and Solution Reporting is practically summarized in the document I will describe next. However, pay attention to the definitions and transactions, as well as to the general purpose of EarlyWatch Alert and the types of alerts.
    • The last document with key information that you will encounter is L2225 – EarlyWatch Alert, Overview. Several exam questions are related to its content, so understand the differences between EarlyWatch Alert and EarlyWatch Check, the frequency of both, how the checks are performed and, finally, how the checks are included in the support agreement.

Final Thoughts

As always, it is recommended to go through a set of sample questions to get a more practical idea of what is coming. You can find some examples at the following SAP Education link: C_BOSUP_90 Sample Questions.

In my personal experience, this is not the most difficult certification exam that SAP offers, so I would really like to encourage you to read all the content at least once and then focus on the key topics mentioned above; that should leave you very well positioned to obtain a successful result.

I hope this article proves useful for getting an overall understanding of how to approach this exam, and hopefully it will help you become a brand new SAP BusinessObjects Support Consultant! Good luck!

If you have any questions or anything to add that could help improve this post, please feel free to leave a comment, and share the post if you found it helpful. You can also contact me via Twitter: @IsaacGil_BI

 

Attach a Dashboard Screenshot to an Email with one “click”

It is impressive how far we can get during a project if we try to meet all our customers' requirements, including those that seem somewhat complicated to solve. During one of our projects in the Middle East we received one such request: our customer asked us to build a functionality to send screenshots of their dashboard by email. Fair enough.

We immediately thought of installing a free PDF creator tool and telling them to print to PDF and then attach the document to the email, but that was too many steps according to our customer. We needed to achieve this functionality with a single "click".

Within a couple of hours, and after some emails to my colleagues Pierre-Emmanuel Larrouturou and Lluis Aspachs, we were working on a solution based on open source software and free tools that we found on Google.

Below are the steps we followed to achieve the goal:

We created the .exe file that takes the snapshot and attaches it to an email:

  • It looks for the C:/Temp or D:/Temp folder to save the image
  • It looks for Outlook (Office 2003, 2007 or 2010) on both the C:/ and D:/ drives
  • We added Xcelsius_burst.bat to skip the Windows prompts asking to authorize the launch of the .exe
  • We saved the two files on the C:/ drive, but they can also be placed on D:/; if the user creates a dedicated folder, only the .bat file needs to be edited
  • We added the .bat file path to a URL button in Xcelsius and ran it

Notes: please check your browser options to suppress the .bat pop-ups if they are a problem. This version only works if installed on each customer machine. If you want to install it on a server (to avoid the multiple installations), you can create a more complex solution using the PsTools suite, available for free on the web, and adding it to your web server (in our case it was Tomcat).

 

You can download the files by clicking on the link below. This solution is quite simple but it made our customer quite happy.

Dashboard Burst

 

Just to add more value to the article, there is another way to crack this issue: we are also adding below the latest version of the feature, Dashboard_by_email.exe, which allows any screenshot (not only from dashboards) to be automatically attached to an email. The program needs to run at Windows startup, and the user can get the screenshot attached directly to his/her email by pressing CTRL+ALT+D. Click on the link below to download.

Dashboard by email

 

We are also aware that the market now offers add-ons for Dashboard Design which can meet this and other requirements. You can check out what our friends at Data Savvy Tools (http://datasavvytools.com/) created for dashboard printing. We have tested their component, which allows selecting the dashboard components to be printed out (and it's great).

Let us know your comments and we will be more than happy to discuss these solutions with you.