""

SMEs Run SAP - Myth 1: You Have to be Big to Work with SAP

Myths and misconceptions litter the business world: all small businesses are agile, all enterprises are big corporate entities, sales figures matter most, customers will always buy the best product. As we all know, these don’t stand up to scrutiny. Some large businesses are agile, some small businesses aren’t, cash flow is more important than sales and the best product isn’t always the most popular. Business people can’t afford to make assumptions and blindly follow these misconceptions; that’s why we do our research and base decisions on facts. We are starting a series of blog posts about the myths that surround SAP, the platform Clariba chose to bring Business Intelligence (BI) to our customers. This first one explains why you DON'T have to be an enterprise to work with SAP and SAP BusinessObjects.

SMEs can run SAP

SAP actually works with a client base made up mostly of small to medium-sized enterprises (SMEs). SAP software, combined with Clariba's implementation expertise, allows them to automate processes and gain valuable insight into their businesses. By doing this, these businesses take assumptions out of the equation: instead of relying on anecdotal evidence, they work with the facts. The result is that they find it easier to spot opportunities and identify areas to improve.

However, before these businesses start dealing with the facts, they often need to challenge another set of preconceptions. One Clariba comes across frequently is that SAP only works with enterprise organizations. But this isn’t the case. More than 77% of SAP’s customers are SMEs. That’s about 88,000 businesses! What’s more, SAP has been working with SMEs for nearly 40 years. As a result, the company understands the unique challenges they face and the tools they need to grow.

Clariba's job is to translate that to customers, helping them match their requirements to the best SAP tool available. Our audits help SMEs get a clear picture of where they stand in terms of their BI environment and the benefits they could enjoy with Clariba and SAP:

  • accessible pre-packaged solutions that allow for rapid deployment

  • streamlined processes thanks to automated reporting tasks

  • improved data quality, reducing the misleading information that leads to poor decision making

  • hidden facts uncovered and insights made available so you can take the best course of action

  • better management of the different departments within a small company

If you are an SME and want to know more about SAP Solutions, contact us.

Problem Uninstalling Data Services

I recently faced a problem uninstalling Data Services and I wanted to share the resolution, in case you run into the same issue. I was trying to upgrade a Data Services machine following the SAP procedure (copy the configuration files, uninstall, then install the new version – not very sophisticated, as you can see). This was not as simple as I first thought.

Problems started after uninstalling the software: the new version refused to install, stating that I should first uninstall the previous version. I uninstalled the software again… but Data Services was still there, so I uninstalled again, but this time the process failed (which makes sense, as the software was already uninstalled), so I kept trying… reboot… uninstall… reboot… rename the old path… reboot… you see where this is going…

 

So, how did I finally solve this?

  1. Start the Registry Editor (type regedit in a command window or in the Run dialog).
  2. Take a backup of the current Registry content. To do this, select the top node of the registry (Computer), go to File -> Export and choose a name for the backup file.
  3. Delete the key HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\EIM (Suite XX.X may vary). NOTE: You may want to write down the key HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\EIM\Keycode first, as it contains the license code.
  4. To remove the software's entry from the Windows uninstall dialog, go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall and delete the key whose DisplayName property is "BusinessObjects Data Services".
  5. Finally, delete the content of the installation directory (typically C:\Program Files\Business Objects\Business Objects Data Services).

Now you can launch the installer and it should work.

I hope this helps if you are experiencing the same issue. If you have any doubts or have faced this problem yourself, leave a comment below.

 

Exxova Announces Clariba as New Channel Partner

Exxova Worldwide Corp., a Georgia-based business intelligence (BI) solutions provider, announced on the 16th of April a value-added reseller agreement with Dubai-based Clariba to resell Exxova's MyBI Mobile product as part of Clariba's solution range. The MyBI Mobile technology gives Clariba customers the ability to meet the ever-growing demands of business users to access actionable information through either iOS or Android mobile devices. “We are pleased to have Clariba join the growing Exxova partner network and act as a value-added reseller through its operations in Spain, Europe, and the Middle East,” said Todd Baker, vice president, Global Channel Sales, Exxova. “Businesses are wanting a mobile BI solution without having to redevelop, redesign, and re-platform their BI content. Clariba – an SAP® channel partner – provides leading support in the implementation of SAP® BusinessObjects™ business intelligence (BI) solutions and provides validation of the need for full business intelligence on a mobile device.”

"We believe that it’s important for the entire SAP BusinessObjects BI portfolio to run on multiple mobile platforms," said Marc Haberland, managing director of Clariba. "Our customers have been asking for SAP BusinessObjects Dashboards to run on iOS, which Exxova delivers. In addition, Exxova’s MyBI Mobile solution is a very cost-effective, easy-to-use option to deliver existing SAP BI content to decision-makers on a tablet device without compromising data security.

Launched in 2011, the MyBI Mobile product now gives customers the ability to go mobile with SAP BusinessObjects content, including SAP BusinessObjects Dashboards, without the limitations currently found on iOS and Android devices.

“Users can expect the same access to the reports, documents, and dashboards that they have on their PCs,” said Glenn Hillam, senior solution architect, Exxova.  “They can print, change, update, and do all of the work that they currently perform on their enterprise systems through a mobile device. The difference is that the app allows users to access their enterprise information at the decision point.”

For more information on Exxova’s MyBI Mobile app, please visit Exxova’s website at www.exxova.com.

 

About Exxova

Exxova Worldwide Corporation is a global technology company delivering high quality products and services through innovative solutions, methodologies and tools. With a relentless focus on innovation, Exxova differentiates from the norm and brings unique, creative and flexible ways of accomplishing complicated initiatives.  Exxova is a recognized thought leader in business intelligence and also provides solutions in data warehousing and enterprise consulting.  Headquartered in Atlanta, Georgia, Exxova has operations in the UK, Singapore, and India.  For more information on Exxova’s solutions, including the MyBI Mobile app, please visit www.exxova.com or contact us at +1 770-343-8484.

About Clariba

Clariba, founded in 2002, is an expert provider of business intelligence services, training and solutions across many industries including telecommunications, high technology, banking, manufacturing, retail, oil & gas, public sector and education. Clariba works with the most advanced BI technologies and works closely with SAP as a channel partner. It is recognized internationally as a leading BI consultancy offering support of SAP BusinessObjects solutions.

With a focus on best-practice BI implementations and its rapidly deployable FastTrack solutions for enterprise performance management, planning & analysis, and operational reporting, Clariba provides its customers with the clarity and actionable insight to improve their business performance.

Clariba has a European office in Barcelona, Spain and Middle East offices in Dubai, UAE and Doha, Qatar. For more information on Clariba, please visit www.clariba.com or contact us at +971 4431 5011

Installing Data Services 4.0 in a Distributed Environment

Following my first experiences with the SAP BusinessObjects 4.0 platform, I decided to write this article after spending three days figuring out how to install Data Services 4.0 in a distributed architecture. Why did it take so long? There is something different in the new Data Services: SAP has tried to unify the security and control of the servers through the CMC. With this new feature we are able to manage Data Services users and services from the CMC instead of the Data Services Management Console. This brings slight changes to the installation process of Data Services 4.0 compared with previous releases.

I was working with a distributed architecture, which means that I was planning to install the SAP BusinessObjects platform (including Live Office, Dashboard Designer 4.0, Client Tools and Explorer) in one server, let’s say “ServerA”, and Data Services 4.0 in a separate server named “ServerB”.

If you have installed older versions of BusinessObjects such as XI R2, XI 3.0 or 3.1, you know the usual approach: install the BusinessObjects platform on ServerA and Data Services on ServerB. You would then find that Data Services is not integrated with the rest of the platform, and you would probably see in the CMC the same error shown in the image below.

As you can see, this image comes from an SAP note that explains how to solve this problem in past releases. However, if you install Data Services 4.0 in a distributed architecture, the error won’t be solved by the solution described in SAP Note 1615646.

Once the scenario is clear, let’s start with the process to install Data Services in a distributed architecture. Before starting, ensure that the BusinessObjects 4.0 platform and the client tools, with their latest service packs and patches, are installed on ServerA.

 

Step 1: Install Data Services 4.0 in ServerA

As with any new software to be installed, the first step is to uncompress (if needed) the file downloaded from the SAP Service Marketplace on ServerA. Open the root folder, go to data_unit and run setup.exe. After that, choose the language to use during the installation.

After these “typical” steps the installation program checks for required components. Review the results and decide whether to continue with the installation, or abort and correct any unmet requirements.

In the next three steps you have to review the SAP recommendations, accept the license agreement and, finally, fill in the license key code, name and company. Once the license key is verified you can choose the language packs you wish to install.

Then it is time to choose the installation folder, but in this case the wizard doesn’t let you choose: because the BO platform was installed before, it takes the BO installation folder as the default.

After that you have to configure the CMS information. In this case our ServerA is going to be named “MVBOBITEST01”.

This is the important step! The next screen invites you to select what you want to install. On ServerA you have to install all the Data Services features you need apart from Designer, Job Server and Access Server. See the image below.

The subsequent screen will ask for merging with an existing configuration. In this case no existing configuration can be reused so the answer is: skip configuration.

Then you can choose whether to use an existing database server or skip this part and do it after the installation. Choose what suits you best. Imagine that you or your vendor don’t have the database or schemas for the CMS repositories ready when you are planning to install Data Services: in that case, you can configure the CMS parameters after the installation without problems.

After filling in the information for the CMS system database and the Audit database (if required), the Metadata Integrator configuration starts. I am not going to describe this step in depth because it doesn’t have an impact on our installation. Furthermore, configuring the Metadata Integrator is not difficult; as always, you only have to choose the ports, folders and the name.

Once we have finished configuring the Metadata Integrator and ViewData, the installation will start. After the installation process we can proceed to the next step.

Step 2: Install Data Services 4.0 in ServerB

After a few hours installing the first part of Data Services, we are ready to install Data Services on a dedicated server which is going to host the ETL only.

Again it is time to uncompress the file you downloaded before, or mount the DVD or whatever installation media you are using, and run setup.exe.

Once the installation starts, repeat the same first steps mentioned before until you reach the screen to specify the CMS information. Add the CMS information related to ServerA, or “MVBOBITEST01”. Why? Because we don’t have a CMS installed on ServerB.

Choose the components that we did not choose before, during the BusinessObjects server installation: Job Server, Access Server and Designer.

As we did in the ServerA installation, choose to skip the configuration if no previous configuration exists.

In the next step you have to configure the account under which you would like to run Data Services. You can choose to run it using a system account or with a specific service account previously defined. The advantage of using a service account rather than a system account is that if you want to stop a service (like the Data Services Job Server) you need the password of this account, because it is not related to the system account with which you log on to the operating system.

Then configure the Access Server. In this case I kept the default values.

After the last step the wizard asks to start the installation. After a couple of hours you will have Data Services 4.0 running.

There is another important point after the installation process that I will be covering in my next article, due at the beginning of May: the repository configuration. One of the best new features of Data Services 4.0 is the integration with the BO platform through the CMC, which results in complete integration once you configure the repositories properly.

If you have any questions or other tips, share them with us by leaving a comment below.

How to convert a Universe to Multi-source in SAP BusinessObjects BI4

The new BI 4 offers a very powerful meta-layer capability: a single Universe can sit on top of several data sources, with the great benefit of real-time integration. At first glance, you might think that existing Universes would need to be rebuilt from scratch, but this article explains how to re-use an existing Universe to provide this highly scalable and expandable meta-layer.

The multi-source Universe

A multi-source Universe is now designed as a project with the following components:

  • Connections
  • Data Foundation
  • Business Layer

These items can be created and configured separately, and then connected to one another. The cycle of creating a new Universe is easy because the connections, data foundation and business layer can be created intuitively, always using common SQL, with no need to know the peculiarities of each native connection. Once built, what happens behind the scenes is transparent to end users: they will simply see that BusinessObjects produces a query which takes pieces of information from different sources in real time.

However, while the creation process is quite simple when generating a new Universe from scratch, it is not so straightforward if we are migrating from a legacy universe. Let’s see why.

UNV to UNX conversion process

In our experience, the three steps to be completed are the following:

  • Legacy Universe (UNV) import: Using the standard migration process the legacy Universe can be inserted into the new BI4 platform. This can be done in a very short time and it has the following quick advantages:
    • Migrated Web Intelligence reports will still sit on top of this legacy meta-layer.
    • Live Office BI4, Crystal 2011 and other client tools can continue to perform as these are still using this format.

But we still cannot use platform modules like Explorer BI4 or Crystal Enterprise, nor the new security model or the new features of the Information Design Tool, so the natural next step is to enable this.

  • New Universe (UNX) conversion: From the Information Design tool we will click on File, “Convert .unv universe” and a new UNX universe is provided, with a project containing the three main items: Connection, Data Foundation and Business Layer. The advantages are the ones we previously stated, but there is one big disadvantage: The automatically generated Data Foundation is mono-source type, so the resulting Universe will not be scalable, and there is no easy way of turning a Data Foundation from mono to multi-source. Therefore this will need to be re-built. The process for re-building the Universe is explained simply in the following step.
  • New Universe (UNX) multi-source conversion:

A new Data Foundation shall be created, following the steps stated below:

  • Define connections
  • Create new Data Foundation
  • Copy and paste items to the new Data Foundation and/or re-type tables and joins using the standard SQL language.

Also the Business Layer needs changes, basically to be re-pointed to the new Data Foundation. The recommended steps are:

  • Re-point the Business layer to the new Data Foundation
  • The calls from the objects to the tables will need to be re-typed using the standard SQL language

A limitation at this stage is that the useful “View Associated Table” feature, which showed the table lineage of a given object, has disappeared, so this might become quite manual work. Opening the Universe Design Tool in parallel with the Information Design Tool to get the lineage might help here.

Once this is done, verify and export this new universe.

As a final step, the WebI reports can now be re-pointed to the new multi-source UNX so they can be enhanced with new alternative data.

Process summary

See in the following diagram a summary of the process:

  • Step 1: Legacy Universe import
  • Step 2: New Universe UNX conversion
  • Step 3: New Universe UNX multi-source conversion

UNV to UNX conversion process summary


Conclusion

In the short term, in BI4 it should become common practice to have three versions of the same universe:

  • UNV: To preserve the legacy WebI reports and to use certain client tools like Crystal 2011 or Live Office.
  • UNX mono-source: To use certain platform tools like Explorer or Crystal Enterprise and to have a higher level functionality.
  • UNX multi-source: To use certain platform tools like Explorer or Crystal Enterprise, have a higher level functionality and be able to use several sources in one Universe.

In the mid-term, only this last multi-source version should remain.

Benefits

This Universe conversion method is time-efficient as it reuses all existing folders and objects, and the tips above should make the Universe re-creation smoother.

The multi-source Universe brings major benefits to end users, providing real-time integration and simpler report design, which will make their lives easier. It also helps the meta-layer designers, who will see their development time reduced thanks to the new design panel functionalities and a common standard language which is easier to understand. Project managers and architects can also consider the fact that they do not have to build a full Data Warehouse for their projects, and with all this, IT managers will see a quick ROI and a lower TCO for their investments.

If you have questions about this method or about the new Information Design Tool in SAP BI4, or if you want to share your experience or tips, please feel free to leave a comment!

How to deploy SAP BusinessObjects 3.1 Web Applications with IBM Websphere

As we all know, Tomcat and IIS are the most commonly used tools to deploy web applications (e.g. InfoView, CMC, etc.) in SAP BusinessObjects, and this deployment can be done automatically through the BusinessObjects server installation. However, SAP BO also allows you to perform this with other application servers. In this article I will explain how we can deploy SAP BusinessObjects web applications using IBM WebSphere.

First of all, we have to agree that any web deployment apart from those done with Tomcat and IIS must be done manually.

  • Supported application servers

SAP BusinessObjects 3.1 is supported on IBM WebSphere 6 Express Edition or 6 ND Edition.

  • Installation
    • Make sure that IBM WebSphere has been installed successfully on the machine and that all its services are up and running.
    • During the SAP BusinessObjects server installation, when you reach the web deployment part, DO NOT SELECT any of the options to deploy to Tomcat or IIS; just check the box to deploy the web applications manually later.
  • Web configuration file
    • The wdeploy configuration file is:

<BO_install_dir>\deployment\config.websphere6

    • Modify the config.websphere6 file, adjusting the values shown below to match your environment.

config.websphere6 file:

# as_dir: the installation directory of the application server
as_dir=C:\Program Files\IBM\WebSphere\AppServer

# as_instance: the application server instance to deploy to
as_instance=server1

# as_virtual_host: the virtual host the applications will be bound to
as_virtual_host=default_host

# as_soap_port: the SOAP administration port of the administration server.
#   If the value is not set (if the line is commented out), the default value is used.
as_soap_port=8880

# as_admin_is_secure (default: false): is security activated in WebSphere?
#   Security is activated when a user wishing to log into the admin portal has to provide
#   a username and a password. When security is NOT activated, it is not necessary to
#   provide as_admin_username and as_admin_password (the lines can be commented out)
as_admin_is_secure=false
as_admin_username=admin
#as_admin_password=%AS_ADMIN_PASSWORD%

# ws_instance: the web server instance that will serve the requests, in distributed mode
#ws_instance=webserver1 (TO BE USED IF the web server is installed in SPLIT mode)

## Don't remove next line
enforce_file_limit=true

 

  • Command used to deploy the applications

To deploy the web applications, open a command line (CMD) on the BO server and run:

wdeploy config.websphere6 deployall

This will deploy all the BO web applications onto the IBM WebSphere server. The process takes about 20 minutes and installs 17 applications.

  • Deploying web applications with the WebSphere administration console

Ensure that your WebSphere web application server is installed, configured and running before deploying WAR files.

  1. Log in to the WebSphere Application Server Administrative console using the following URL: http://WAS_HOSTNAME:PORT/admin (the WebSphere admin console's default port number is 9060). Give a unique name to your web application and proceed to "Step 2".
  2. Under the Applications heading of the console navigation menu, click Enterprise Applications on the left navigation pane. Highlight the server you created (or server1 if you didn't create your own) in the Clusters and Servers list and enable the "Select" checkbox. Proceed to "Step 3".
  3. Click the Install button and navigate to the location of the WAR file to deploy. If deploying from a remote file system, select the option "Remote File System". Select the virtual host you created (or default_host if you didn't create your own) from the Virtual Host drop-down list. Proceed to "Step 4".
  4. Enter a context root for the WAR file (e.g. /CmcApp for CmcApp.war) and press the Next button, followed by Continue.
  5. Review the summary page, and press Finish when done.
  6. Click Save to Master Configuration.
  7. Click the Save link, then the Save button.
  8. Under the Applications heading of the console navigation menu, click Enterprise Applications on the left navigation pane.
  9. Verify that the WAR file was deployed, and then click the Start button. Repeat these steps for each WAR file to deploy.
  • Test

To test your deployment, just open a browser and enter the URL (e.g. for InfoView):

http://<BOservername>:<PortNumber>/InfoViewApp

 

If you have any questions or contributions, please leave a comment below.

Attend the Clariba Webinar "Why Migrate to SAP BusinessObjects BI 4?"

Do you wish to know more about the reasons to migrate to SAP BusinessObjects BI 4, the most advanced Business Intelligence platform?

Attend our Webinar on the 12th of April, from 11:00-12:00 CET (Presented in Spanish)

REGISTER HERE

SAP BusinessObjects BI 4 offers a complete set of functionalities that are key in today's Business Intelligence market: improved performance management, reporting, search, analysis, data exploration and integration. This new version of SAP's BI platform introduces several significant improvements to your BI environment, with a great number of functionalities designed to optimize performance.

With this in mind, Clariba invites you to invest an hour of your time to get to know the new features and advantages of SAP BusinessObjects BI 4, the most advanced BI platform.

The agenda of our webinar is the following:

  • Welcoming and introduction
  • What is new in SAP BusinessObjects BI 4
  • Benefits of migrating to SAP BusinessObjects BI 4
  • Why migrate with Clariba
  • Questions and answers

For more information about SAP BusinessObjects BI 4, visit our website.

Best Regards,

Lorena Laborda Business Development Manager - Clariba

 

Attend the Webinar "Why Migrate to SAP BusinessObjects BI 4?"

Would you like to know more about the reasons to migrate to SAP BusinessObjects BI 4, the most advanced Business Intelligence platform?

Attend our webinar on the 12th of April, from 11:00 to 12:00 (CET)

REGISTER HERE

SAP BusinessObjects BI 4 is the first and only Business Intelligence (BI) platform that provides a complete range of functionalities that are key in today's BI market: better performance management, reporting, queries and analysis, data exploration and integration. This new version of the SAP platform introduces significant advances to your BI environment, with a great number of features designed to optimize its performance.

With this in mind, Clariba invites you to invest an hour of your time to learn about the new features and advantages of SAP BusinessObjects BI 4, the most advanced platform on the Business Intelligence market.

The agenda for the webinar "Why Migrate to SAP BusinessObjects BI 4?" is as follows:

  • Welcome and introduction
  • What's new in SAP BusinessObjects BI 4
  • Benefits of migrating to SAP BusinessObjects BI 4
  • Why migrate with Clariba
  • Questions and answers

For more information about SAP BusinessObjects BI 4, please visit our website. Best regards,

Lorena Laborda Business Development Manager - Clariba

Applying Custom Security on Web Intelligence Documents

One of the main hurdles that a Web Intelligence developer has to overcome is how to deal with data security. Indeed, data security remains an overriding concern for many companies trying to ensure the availability, integrity and confidentiality of their business information, protecting the database both from destructive actions and from unwanted access or data visualization by unauthorized users. In SAP BusinessObjects we have several ways to set up a security roadmap in terms of authorizing data access, but this time I would like to talk about how to apply custom security to our WebI documents by using a simple table, joins forced in the Universe Designer, and the WebI tool, in order to show only the data that a user is authorized to see.

We have the following scenario: imagine that we have a group with different levels of hierarchy in terms of data access. The higher you are in the organization, the more data you have access to. The first level of the hierarchy can see all the data; the second level can see its own level's data and the levels below, but won't have access to first-level information; the third level can see its own level's data and the levels below, but won't have access to second- and first-level information… and so on.

 

Let's now see a step-by-step approach to achieving this.

The first thing to do is define the hierarchy structure, specifying each individual's level and, therefore, the data he or she will have access to. After that, we have to create a table in our database where we will store groups, users and privileges. The key fields for this purpose are:

BO_User: This will be checked against the current user accessing the WebI document.

Highest_Level: The level of the hierarchy the user belongs to. For this example we have four organization levels, where 0 is the highest level and 3 is the lowest.

Level_Value: This will be checked against the fact table.
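As an illustration, a minimal version of this security table could look like the sketch below. The table and column names follow the example in this article; the data types and the sample rows are assumptions to adapt to your own database.

-- Hypothetical DDL for the security table described above (data types are illustrative).
CREATE TABLE CLARIBA.SECURITY (
    Security_Group  VARCHAR2(30),   -- logical group the rule belongs to (e.g. 'CLARIBA')
    BO_User         VARCHAR2(100),  -- BusinessObjects login, matched against CurrentUser() in the WebI document
    Highest_Level   NUMBER(1),      -- 0 = top of the hierarchy, 3 = lowest level
    Level_Value     VARCHAR2(100)   -- value matched against the corresponding organization level in the fact table
);

-- Hypothetical sample rows: a top-level user who sees everything,
-- and a level-2 user restricted to one branch of the organization.
INSERT INTO CLARIBA.SECURITY VALUES ('CLARIBA', 'MarcH',   0, 'HEADQUARTERS');
INSERT INTO CLARIBA.SECURITY VALUES ('CLARIBA', 'SilviaR', 2, 'SALES EMEA');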

Once we have the table with all the related data stored, it is time to map it in the SAP BusinessObjects meta layer. For this purpose we have to import the affected universe and create a derived table which will retrieve all the related data for a given user (that is, all the data that a user is able to see according to his or her data access level). The SQL code should be something like the following:

SELECT BO_User, Level_Organization_3
FROM CLARIBA.SECURITY a
LEFT JOIN (
    SELECT Level_Organization_0, Level_Organization_1, Level_Organization_2, Level_Organization_3
    FROM CLARIBA.FACT_TABLE
    GROUP BY 1, 2, 3, 4
) b
ON (
       (Highest_Level = 0 AND UPPER(a.Level_Value) = b.Level_Organization_0)
    OR (Highest_Level = 1 AND UPPER(a.Level_Value) = b.Level_Organization_1)
    OR (Highest_Level = 2 AND UPPER(a.Level_Value) = b.Level_Organization_2)
    OR (Highest_Level = 3 AND UPPER(a.Level_Value) = b.Level_Organization_3)
)
WHERE security_group = 'CLARIBA'

 

From this derived table we will create a couple of universe objects which will be used in the WebI document that we want to secure (BO_User and Level_Organization_3).

The third step is to develop and apply the security in the WebI document where we want to carry out the data restriction. For this purpose we have to create two new dimensions and one detail object. Ensure that your query includes the newly created universe objects.

The first task is discovering which user is trying to access the WebI document. We can get their login by creating a new dimension named “BO_User” that contains the following formula:

 =CurrentUser()

 

Once we know who is trying to access the WebI document, we have to check whether this BO_User matches a user name in our table. We can create a dimension named “FlagBOUser” with the following formula:

=If(Lower([BO_User])=Lower([User Name]);1;0)

 

The next step is to control what level of data access this BO_User will have; in other words, we are applying a kind of row/column-level security. For this purpose we create a detail object named “Level_Organization” with the following code:

=If([FlagBOUser]=1;[Level_Organization_3])

 

Once we have these objects, the very last step is to drag and drop both FlagBOUser and Level_Organization as global filters at document level. This way we apply the data restriction to every data block displayed in the report.

The conditions to be applied are simple: “FlagBOUser” must be equal to 1, meaning that the current user corresponds to a user in the database table, and “Level_Organization” must not be null, meaning that there is data to be displayed.

At this point of the exercise, we should be able to restrict the data displayed in the WebI document according to the user who accesses it.

Last but not least, we can also control particular cells, such as subtotals, by creating a flag that ensures only the employees who are allowed to see this content can actually see it.

=If(Lower([BO_User]) InList ("silviar";"josey");1;0)

 

As we have seen in this example, custom security in WebI provides an alternative to the other types of security we can apply in our BO system (such as row-level security in the Universe Designer). We can achieve a very reasonable data security solution with simplicity, effectiveness and reduced maintenance requirements.

If you have any questions do not hesitate to leave a comment below.

Socializing your success - Interview with Marc Haberland

As part of the SAP Best Performance Challenge 2012, we conducted an interview with Marc Haberland, managing director of Clariba, on the challenges our company is facing, how we are dealing with them and how this has brought us success.

  • What is the greatest challenge you see in the market today and how is your firm dealing with it?

Naturally, the state of the global economy has also had an impact on our company. Along with increasing competition and pressure on consultancy rates due to the shrinking number of projects, we are mainly faced with delays to projects that were already approved, and as a result an inability to plan accurately. This has required us to increase our own internal reporting, hold more frequent resource planning meetings and apply tighter financial control.

On the positive side, the challenges in the market have also required our customers to take a closer look at optimizing their business intelligence systems and related processes while choosing the best value for money. As a result, we have been very successful in helping companies optimize their BI investments and improve resource allocation by helping them deploy BI competency centers as opposed to decentralized BI. Focused solutions such as our 360 BI Assessment, our prepackaged solutions that help customers achieve faster ROI, as well as our focus on certifications, training and excellence have helped us build a pool of the best BI consultants in the market – a key to survival and continued growth.

  • What processes do you use for your business planning and for adapting your plan to current market conditions?

Like many companies, we started out managing our business with an Excel sheet. Given the need for better transparency, control and reporting, over the past 18 months Clariba has made a major investment in a cloud-based ERP & CRM solution. This solution, which touches every single aspect of our company, now provides us with the information and processes we need to be successful. We have also put a major effort into the BI aspects to ensure that we have the visibility we need to take rapid decisions. In fact, we have several customers interested in replicating some of our internal reporting solutions, such as our BI Project Management dashboard, which includes earned value analysis and more.

  • How do you perceive the marketing plan using interactive online tools such as social media, social networks, blogs, or other digital media?

Social media in our business, where we deal with B2B interactions, is still in its early stages. Yet behind every company we work with or target we find a group of amazing people who want to interact, feel taken care of and build a long-term trust relationship for the benefit of the company they work for. For this reason, we have embraced social media since about 2008 with blog articles, tweets, a LinkedIn profile and lately our own YouTube channel. For a consulting company we are very active in social media. In fact, just recently the SAP social media team invited Clariba to speak about the success we have had with social media and the best practices we recommend to other SAP partners.

  • How does the Best Performance Challenge help your firm and its employees?

I believe the Best Performance Challenge is an excellent initiative as it has brought together a multidisciplinary team from Clariba to compete in a fun and engaging way. Not a week passes without our team eyeing the current results and where we stand! But not only is it a fun experience, it has also forced many different people in the organization to stop their day-to-day activity and focus on a specific question, to learn and to provide new impulses that will ultimately serve Clariba and our relationship with SAP. We need to evolve and we need to continue learning. The Best Performance Challenge has enabled us to do just that!

Implementing Materialized Views in Oracle - Execute queries faster

Let's assume that you've been convinced by Marc's excellent article about the aggregate awareness dilemma, and that after balancing all the arguments you've decided to implement the aggregates in your Oracle database. Two parts are necessary: the materialized views and the query rewrite mechanism.

What is a materialized view?

Think of it as a standard view: it is also based on a SELECT query. But while views are purely logical structures, materialized views are physically created, like tables. And like tables, you can create indexes on them. Unlike tables, however, materialized views can be refreshed (automatically or manually, as we'll see later) against their definitions.

Let's imagine the following situation: a multinational company manages the financial accounts of its subsidiaries. For each period (year + month) and for each company, many thousands of records are saved in the data warehouse (with an account code and an MTD (month-to-date) value). You'll find below a very simplified schema of this data warehouse.
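For reference, a minimal sketch of the tables assumed by the queries in this article is shown below. The column names come from the queries themselves; the data types are assumptions, and the foreign key between the fact table and DIM_COMPANY is discussed further down.

-- Simplified schema assumed by the examples (illustrative data types).
CREATE TABLE dim_period (
    period_key NUMBER PRIMARY KEY,
    year       NUMBER(4),
    month      NUMBER(2)
);

CREATE TABLE dim_company (
    company_key  NUMBER PRIMARY KEY,
    company_code VARCHAR2(20)
);

CREATE TABLE account_balance (
    period_key   NUMBER,          -- references dim_period.period_key
    company_key  NUMBER,          -- references dim_company.company_key (constraint discussed later in the article)
    account_code VARCHAR2(20),
    mtd          NUMBER           -- month-to-date value
);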

What happens when we want to have the sum of all accounts for each period?

Without a materialized view, all the rows have to be retrieved so that the sum can be calculated. In my case, the following query takes around 2 seconds on my test database. The explain plan tells me that more than 1 million records had to be read in the first place.

(Query 1)

select p.year, p.month, sum(a.mtd)
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month

So how do you avoid reading more than 1 million records? One solution is to maintain aggregate tables in your database, but that means a bigger ETL and a more complex Universe with @aggregate_aware functions. Although this could be a valid option, we've chosen to avoid it.
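For reference, such an aggregate table would be something like the sketch below (illustrative only); it would then have to be rebuilt or updated by the ETL after every load, which is exactly the maintenance we want to avoid.

-- Illustrative aggregate table: pre-computed sums that the ETL must keep in sync with the fact table.
CREATE TABLE agg_period_mtd AS
select p.year, p.month, sum(a.mtd) AS mtd
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month;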

Another solution is to create a materialized view. The syntax can be quite simple:

(Query MV-1)

CREATE MATERIALIZED VIEW MV_PERIODS
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select p.year, p.month, sum(a.mtd)
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month

Let's go through the query lines.

  • CREATE MATERIALIZED VIEW MV_PERIODS => We simply create the view and give it the name MV_PERIODS.
  • BUILD IMMEDIATE => The materialized view will be built now
  • ENABLE QUERY REWRITE => If we don't specify this, then the materialized view will be created and could be accessed directly, but it wouldn't be automatically used by the query rewriting mechanism.
  • The "as select…" is the same as the original query we made.

You'll notice when executing this query that the time needed to create this materialized view is at least the time needed to execute the sub-query (+ some time needed to physically write the rows in the database). In my case it was 2.5 seconds, slightly more than the original 2 seconds.

If I now re-execute my original query, I get the same result set as before, but instead of 2 seconds it now takes 16 milliseconds. That's roughly 120 times faster! Oracle understood that it could automatically retrieve the results from the materialized view, so it only read this small table instead of doing a full read of the fact table.
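If you want to check this yourself, one way (shown as a sketch below) is to look at the execution plan: after the rewrite, the plan should access MV_PERIODS instead of scanning ACCOUNT_BALANCE.

EXPLAIN PLAN FOR
select p.year, p.month, sum(a.mtd)
from dim_period p
join account_balance a on a.period_key = p.period_key
group by p.year, p.month;

-- Display the plan: look for an access on MV_PERIODS (e.g. MAT_VIEW REWRITE ACCESS FULL)
-- instead of a full scan of the ACCOUNT_BALANCE fact table.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);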

 

The data freshness

Now imagine a new month has gone by and new rows have arrived in your data warehouse. You re-execute your original select query and, to your great surprise, it takes a long time again: 2 seconds! But why?

It is possible to ask Oracle to tell us if a query was rewritten with a given materialized view, and if not to give us the reasons. Let's see a possible syntax below.

SET SERVEROUTPUT ON;
DECLARE
  Rewrite_Array SYS.RewriteArrayType := SYS.RewriteArrayType();
  querytxt VARCHAR2(4000) := '
    select p.year, p.month, sum(a.mtd)
    from dim_period p, account_balance a
    where a.period_key = p.period_key
    group by p.year, p.month
  ';
  no_of_msgs NUMBER;
  i NUMBER;
BEGIN
  dbms_mview.Explain_Rewrite(querytxt, 'MV_PERIODS', Rewrite_Array);
  no_of_msgs := rewrite_array.count;
  FOR i IN 1..no_of_msgs
  LOOP
    DBMS_OUTPUT.PUT_LINE('>> MV_NAME  : ' || Rewrite_Array(i).mv_name);
    DBMS_OUTPUT.PUT_LINE('>> MESSAGE  : ' || Rewrite_Array(i).message);
  END LOOP;
END;

(The query text and the materialized view name are the parts you can update; the rest should stay as is.)

Once I executed these lines, I got the following result:

>> MV_NAME  : MV_PERIODS
>> MESSAGE  : QSM-01150: query did not rewrite
>> MV_NAME  : MV_PERIODS
>> MESSAGE  : QSM-01029: materialized view, MV_PERIODS, is stale in ENFORCED integrity mode

(Technical note: to see these lines in Oracle SQL Developer, you need to activate the DBMS output: go to the menu View / DBMS Output and then click on the 'Enable DBMS Output for the connection' button.)

The line "materialized view, MV_PERIODS, is stale in ENFORCED integrity mode" means that the materialized view is not used because it does not have the right data anymore. So to be able to use the query rewrite process once again, we need to refresh the view with the following syntax:

BEGIN DBMS_SNAPSHOT.REFRESH('MV_PERIODS','C'); end;

Note that in certain situations, the end users may prefer having yesterday's data in 1 second rather than today's data in 5 minutes. In that case, choose the STALE_TOLERATED integrity mode (rather than the ENFORCED default) and the query will be rewritten even if the data in the materialized view is no longer fresh.
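As an illustration of both options, the sketch below schedules an automatic complete refresh (the nightly schedule is just an example) and, alternatively, relaxes the integrity mode at session level so the stale materialized view is still used for query rewrite.

-- Option 1: refresh the materialized view automatically, e.g. every night at 02:00 (illustrative schedule).
ALTER MATERIALIZED VIEW MV_PERIODS
  REFRESH COMPLETE
  START WITH SYSDATE
  NEXT TRUNC(SYSDATE) + 1 + 2/24;

-- Option 2: tolerate stale data for the current session, so the rewrite still happens between refreshes.
ALTER SESSION SET QUERY_REWRITE_INTEGRITY = STALE_TOLERATED;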

 

Extend your materialized views

Now let's imagine that we want to have not only the account sums by periods, but also by company code. Our new SQL query is the following:

(Query 2)

select p.year, p.month, c.company_code, sum(a.mtd)
from dim_period p, account_balance a, dim_company c
where a.period_key = p.period_key
and a.company_key = c.company_key
group by p.year, p.month, c.company_code

Of course the materialized view MV_PERIODS doesn't have the necessary information (company key or company code) and cannot be used to rewrite this query. So let's create another materialized view.

(Query MV-3)

CREATE MATERIALIZED VIEW MV_PERIODS_COMPANIES
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select p.year, p.month, c.company_code, sum(a.mtd)
from dim_period p, account_balance a, dim_company c
where a.period_key = p.period_key
and a.company_key = c.company_key
group by p.year, p.month, c.company_code

So now our query takes a very short time to complete. But what if, after deleting the MV_PERIODS materialized view, you try to execute the first query (the one without the companies)? The query rewrite mechanism will work as well! Oracle will understand that it can use the content of MV_PERIODS_COMPANIES to calculate the sums more quickly.

Be aware that the query will only be rewritten if you have created a foreign key relationship between ACCOUNT_BALANCE.COMPANY_KEY and DIM_COMPANY.COMPANY_KEY. Otherwise you'll get the following message:

QSM-01284: materialized view MV_PERIODS_COMPANIES has an anchor table DIM_COMPANY not found in query.
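If the relationship is missing, it can be declared with a standard foreign key constraint such as the sketch below (the constraint name is illustrative); note that DIM_COMPANY.COMPANY_KEY must be a primary or unique key for the constraint to be accepted.

-- Declare the foreign key needed for the rewrite (constraint name is illustrative).
ALTER TABLE account_balance
  ADD CONSTRAINT fk_account_balance_company
  FOREIGN KEY (company_key) REFERENCES dim_company (company_key);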

 

Is basing the materialized view on the keys an option?

The materialized views we've created are very interesting but still a bit static. You may ask yourself: wouldn't it have been a better idea to base the materialized view on the keys? For example, with the following syntax:

(Query MV-4)

CREATE MATERIALIZED VIEW MV_PERIODS_COMPANIES_keys
BUILD IMMEDIATE
ENABLE QUERY REWRITE
AS
select period_key, company_key, sum(mtd)
from account_balance
group by period_key, company_key

The answer is "it depends". On the good side, this allows for a greater flexibility, as you're not limited to some fields only (as in the query MV-1 where you're limited to year and month). On the bad side, as you're not using any join, the joins will have to be made during the run-time, which has an impact on the performance query (but even then, the query time will be much better than without materialized views).

So if you want a flexible solution because you don't yet know which fields the users will need, it's probably better to use the keys. But if you already know the precise queries that will come (for example for pre-defined reports), it may be worth using the needed fields in the definition of the materialized view rather than the keys.

If you have any doubts or further information to share on this topic, please leave a comment below.