Finding Octane

Integrating transaction logs with web services for PA on AWS using the REST API


In this blog post, we will walk through the process of exposing transaction logging in Planning Analytics (PA) V12 on AWS to users. Currently, Planning Analytics offers no user interface (UI) option to access transaction logs directly from Planning Analytics Workspace. However, there is a workaround: expose the transactions to a host server and access the logs there. By following these steps, you can successfully access the transactions logged in Planning Analytics V12 on AWS using the REST API.


Step 1: Creating an API Key in Planning Analytics Workspace

The first step in this process is to create an API key in Planning Analytics Workspace. An API key is a unique identifier that provides access to the API and allows you to authenticate your requests.

  1. Navigate to the API Key Management Section: In Planning Analytics Workspace, go to the administration section where API keys are managed.
  2. Generate a New API Key: Click on the option to create a new API key. Provide a name and set the necessary permissions for the key.
  3. Save the API Key: Once the key is generated, save it securely. You will need this key for authenticating your requests in the following steps.

Step 2: Authenticating to Planning Analytics as a Service Using the API Key

Once you have the API key, the next step is to authenticate to Planning Analytics as a Service using this key. Authentication verifies your identity and allows you to interact with the Planning Analytics API.

  1. Prepare Your Authentication Request: Use a tool like Postman or any HTTP client to create an authentication request.
  2. Set the Authorization Header: Include the API key in the Authorization header of your request. The header format should be Authorization: Bearer <API Key>.
  3. Send the Authentication Request: Send a request to the Planning Analytics authentication endpoint to obtain an access token (a minimal sketch follows this list).
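
A minimal sketch in Python using the requests library. The host and endpoint path below are placeholders, not the real URL; the exact authentication endpoint for your tenant is given in the IBM technote linked below.

import requests

API_KEY = "<your-api-key>"  # generated in Step 1
# Placeholder URL -- substitute the endpoint from the IBM technote
AUTH_URL = "https://<your-pa-host>/api/<tenant>/v0/..."

resp = requests.get(AUTH_URL, headers={"Authorization": f"Bearer {API_KEY}"})
resp.raise_for_status()  # a 200 response confirms the key authenticates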

Detailed instructions for Step 1 and Step 2 can be found in the following IBM technote:

How to Connect to Planning Analytics as a Service Database using REST API with PA API Key

Step 3: Setting Up an HTTP or TCP Server to Collect Transaction Logs

In this step, you will set up a web service that can receive and inspect HTTP or TCP requests to capture transaction logs. This is crucial if you cannot directly access the AWS server or the IBM Planning Analytics logs.

  1. Choose a Web Service Framework: Select a framework like Flask or Django for Python, or any other suitable framework, to create your web service (a minimal Flask sketch follows this list).
  2. Configure the Server: Set up the server to listen for incoming HTTP or TCP requests. Ensure it can parse and store the transaction logs.
  3. Test the Server Locally: Before deploying, test the server locally to ensure it is correctly configured and can handle incoming requests.
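
As a minimal sketch, here is a Flask app that accepts POSTed log entries and appends them to a local file. The route, port, and file name are illustrative choices, not anything Planning Analytics prescribes.

import json
from flask import Flask, request

app = Flask(__name__)

@app.route("/logs", methods=["POST"])
def receive_logs():
    # Append each incoming transaction-log payload to a local file
    payload = request.get_json(silent=True) or request.data.decode()
    with open("transaction_logs.jsonl", "a") as f:
        f.write(json.dumps(payload) + "\n")
    return "", 204

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)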

For demonstration purposes, we will use a free web service provided by Webhook.site. This service allows you to create a unique URL for receiving and inspecting HTTP requests. It is particularly useful for testing webhooks, APIs, and other HTTP request-based services.

Step 4: Subscribing to the Transaction Logs

The final step involves subscribing to the transaction logs by sending a POST request to Planning Analytics Workspace. This will direct the transaction logs to the web service you set up.

Practical Use Case for Testing IBM Planning Analytics Subscription

Below are the detailed instructions related to Step 4:

  1. Copy the URL Generated from Webhook.site:
    • Visit Webhook.site and copy the generated URL (e.g., https://webhook.site/<your-unique-id>). The <your-unique-id> refers to the unique ID found in the "Get" section of the Request Details on the main page.

  2. Subscribe Using the Webhook.site URL:
    • Open Postman or any HTTP client.
    • Create a new POST request to the subscription endpoint of Planning Analytics.
    • In Postman, update your subscription to use the Webhook.site URL via the POST request below:

  • In the body of the request, paste the URL generated from Webhook.site:

{
 "URL": "https://webhook.site/your-unique-id"
}
<tm1db> is a variable that contains the name of your TM1 database.
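
As an illustration, here is the same subscription call in Python; the endpoint path shown is a placeholder (confirm the exact subscribe URL for your environment in the use case linked above):

import requests

API_KEY = "<your-api-key>"
TM1DB = "<tm1db>"  # name of your TM1 database
# Placeholder path -- confirm the real subscribe endpoint for your tenant
SUBSCRIBE_URL = f"https://<your-pa-host>/api/<tenant>/v0/tm1/{TM1DB}/.../Subscribe"

resp = requests.post(
    SUBSCRIBE_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"URL": "https://webhook.site/<your-unique-id>"},
)
print(resp.status_code, resp.text)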

Note: Only the transaction log entries created at or after the point of subscription will be sent to the subscriber. To stop the transaction logs, update the POST query by replacing /Subscribe with /Unsubscribe.

By following these steps, you can successfully enable and access transaction logs in Planning Analytics V12 on AWS using the REST API.


Tips on how to manage your Planning Analytics (TM1) effectively


Effective management of Planning Analytics (TM1), particularly with tools like IBM’s TM1, can significantly enhance your organization’s financial planning and performance management. 


Here are some essential tips to help you optimize your Planning Analytics (TM1) processes:

1. Understand Your Business Needs

Before diving into the technicalities, ensure you have a clear understanding of your business requirements. Identify key performance indicators (KPIs) and metrics that are critical to your organization. This understanding will guide the configuration and customization of your Planning Analytics model.

2. Leverage the Power of TM1 Cubes

TM1 cubes are powerful data structures that enable complex multi-dimensional analysis. Properly designing your cubes is crucial for efficient data retrieval and reporting. Ensure your cubes are optimized for performance by avoiding unnecessary dimensions and carefully planning your cube structure to support your analysis needs.

3. Automate Data Integration

Automating data integration processes can save time and reduce errors. Use ETL (Extract, Transform, Load) tools to automate the extraction of data from various sources, its transformation into the required format, and its loading into TM1. This ensures that your data is always up-to-date and accurate.

4. Implement Robust Security Measures

Data security is paramount, especially when dealing with financial and performance data. Implement robust security measures within your Planning Analytics environment. Use TM1’s security features to control access to data and ensure that only authorized users can view or modify sensitive information.

5. Regularly Review and Optimize Models

Regularly reviewing and optimizing your Planning Analytics models is essential to maintain performance and relevance. Analyze the performance of your TM1 models and identify any bottlenecks or inefficiencies. Periodically update your models to reflect changes in business processes and requirements.

6. Utilize Advanced Analytics and AI

Incorporate advanced analytics and AI capabilities to gain deeper insights from your data. Use predictive analytics to forecast future trends and identify potential risks and opportunities. TM1’s integration with other IBM tools, such as Watson, can enhance your analytics capabilities.

7. Provide Comprehensive Training

Ensure that your team is well-trained in using Planning Analytics and TM1. Comprehensive training will enable users to effectively navigate the system, create accurate reports, and perform sophisticated analyses. Consider regular training sessions to keep the team updated on new features and best practices.

8. Foster Collaboration

Encourage collaboration among different departments within your organization. Planning Analytics can serve as a central platform where various teams can share insights, discuss strategies, and make data-driven decisions. This collaborative approach can lead to more cohesive and effective planning.

9. Monitor and Maintain System Health

Regularly monitor the health of your Planning Analytics environment. Keep an eye on system performance, data accuracy, and user activity. Proactive maintenance can prevent issues before they escalate, ensuring a smooth and uninterrupted operation.

10. Seek Expert Support

Sometimes, managing Planning Analytics and TM1 can be complex and may require expert assistance. Engaging with specialized support services can provide you with the expertise needed to address specific challenges and optimize your system’s performance.

By following these tips, you can effectively manage your Planning Analytics environment and leverage the full potential of TM1 to drive better business outcomes. Remember, continuous improvement and adaptation are key to staying ahead in the ever-evolving landscape of financial planning and analytics.

For specialized TM1 support and expert guidance, consider consulting with professional service providers like Octane Software Solutions. Their expertise can help you navigate the complexities of Planning Analytics, ensuring your system is optimized for peak performance. Book a meeting


Saying Goodbye to Cognos TM1 10.2.x: Changes in support effective April 30, 2024


In a recent announcement, IBM unveiled changes to the Continuing Support program for Cognos TM1, impacting users of version 10.2.x. Effective April 30, 2024, Continuing Support for this version will cease to be provided. Let's delve into the details.


What is Continuing Support?

Continuing Support is a lifeline for users of older software versions, offering non-defect support for known issues even after the End of Support (EOS) date. It's akin to an extended warranty, ensuring users can navigate any hiccups they encounter post-EOS. However, for Cognos TM1 version 10.2.x, this safety net will be lifted come April 30, 2024.

What Does This Mean for Users?

Existing customers can continue using their current version of Cognos TM1, but they're encouraged to consider migrating to a newer iteration, specifically Planning Analytics, to maintain support coverage. While users won't be coerced into upgrading, it's essential to recognize the benefits of embracing newer versions, including enhanced performance, streamlined administration, bolstered security, and diverse deployment options like containerization.

How Can Octane Assist in the Transition?

Octane offers a myriad of services to facilitate the transition to Planning Analytics. From assessments and strategic planning to seamless execution, Octane's support spans the entire spectrum of the upgrade process. Additionally, for those seeking long-term guidance, Octane's expertise extends to invaluable support packages covering both the development and support facets of your TM1 application.

FAQs:

  • Will I be forced to upgrade?

    No, upgrading is not mandatory. Changes are limited to the Continuing Support program, and your entitlements to Cognos TM1 remain unaffected.

  • How much does it cost to upgrade?

    As long as you have active Software Subscription and Support (S&S), there's no additional license cost for migrating to newer versions of Cognos TM1. However, this may be a good time to consider moving to the cloud. 

  • Why should I upgrade?

    Newer versions of Planning Analytics offer many advantages, from improved performance to heightened security, ensuring you stay ahead in today's dynamic business environment. Remaining on an unsupported version, by contrast, carries unnecessary risk for your application.

  • How can Octane help me upgrade?

    Octane’s suite of services caters to every aspect of the upgrade journey, from planning to execution. Whether you need guidance on strategic decision-making or hands-on support during implementation, Octane is here to ensure a seamless transition. Plus, we are currently offering a fixed-price option for you to move to the cloud. Find out more here.

In conclusion, while bidding farewell to Cognos TM1 10.2.x may seem daunting, it's also an opportunity to embrace the future with Planning Analytics. Octane stands ready to support users throughout this transition, ensuring continuity, efficiency, and security in their analytics endeavours.


Mastering Calculations in Planning Analytics: Adapt to Changing Months with Ease


One of the standout features of Planning Analytics Workspace (PAW) is its ability to create calculations in the Exploration view. This feature empowers users to perform advanced calculations without the need for technical expertise. Whether you're using PAW or PAfE (Planning Analytics for Excel), the Exploration view offers a range of powerful capabilities. The Exploration view supports a variety of functions, such as aggregations, mathematical operations, conditional logic, and custom calculations. This means you have the flexibility to perform complex calculations tailored to your specific needs. 

This enables users to create complex financial calculations and business rules within the views, providing more accurate and tailored results for analysis and planning. All this can be done by the business users themselves without relying on IT or development teams, enabling faster and more agile reporting processes. This enables creating ad hoc reports and performing self-service analysis on the fly with a few simple clicks. This self-service capability puts the control in the hands of the users, eliminating the need for lengthy communication processes or waiting for IT teams to fulfill reporting requests.

In this blog post, we will focus on an exciting aspect of the Exploration view: creating MDX-based views that are dynamic and automatically update as your data changes. The beauty of these dynamic views is that users no longer need to manually select members of dimensions to keep their formulas up to date.

Similar to the functionality of dynamic subsets in dimensions, where each click in the set editor automatically generates MDX statements that can be modified, copied, and pasted, the exploration views in Planning Analytics Workspace also generate MDX statements. These MDX statements are created behind the scenes as you interact with the cube view. Just like MDX subsets, these statements can be easily customized, allowing you to fine-tune and adapt them to your specific requirements.

By being able to tweak, copy, and paste these MDX statements, you can easily build upon previous work or share your calculations with others.

Currently, the calculations are not inherently dynamic, however, there are techniques that can be employed to make the calculations adapt to changing time periods.

A classic example is performing variance analysis on a P&L cube, where we wish to add a variance formula showing the variance of the current month from the previous month. There are many other calculations we could consider, but we will focus on this analysis in this blog.

If we take our example, the current month and previous month keep changing every month as we roll forward and they are not static. When dealing with changing months or any member in your calculation, it's important to ensure that your calculations remain dynamic and adaptable to those changes. 

To ensure dynamic calculations that reflect changes in months, you have several options to consider:

Manual Approach: You can manually update the column dimensions with the changing months and recreate the calculations each time. However, this method is time-consuming, prone to errors, and not ideal for regular use.

Custom MDX Approach: Another option is to write custom MDX code or modify existing code to reference the months dynamically from a Control cube. While this approach offers flexibility, it can be too technical for end users.

Consolidations Approach: Create consolidations named "Current Month" and "Prior Month" and add the respective months to them as children. Then, use these consolidations in your view and calculations. This approach provides dynamic functionality, but you may need to expand the consolidations to see the specific months, which can be cumbersome.

Alias Attributes Approach: Leverage alias attributes in your MDX calculations. By assigning aliases to the members representing the current and previous months, you can dynamically reference them in your calculations. This approach combines the benefits of the previous methods, providing dynamic calculations, visibility of months, and ease of use without excessive manual adjustments.

In this blog post, we will focus on the alias attributes approach as a recommended method for achieving dynamic calculations in PAW or PAfE. We will guide you step-by-step through the process of utilizing alias attributes to ensure your calculations automatically adapt to changing months. By following this approach, you can simplify your calculations, improve efficiency, and enable non-technical users to perform dynamic variance analysis effortlessly.

To create dynamic calculations for variances between the current and prior month, you can follow these steps:

  • Step 1: Ensure you have an alias attribute available in your Month dimension. If not, create a new alias attribute specifically for this purpose.
  • Step 2: Update the alias with the values "Curr Month" and "Prior Month" for the respective months.
  • Step 3: Open the exploration view in PAW and select the two months (current and prior) on your column or row dimension. 
  • Step 4: Create your variance calculation using the exploration view's calculation capabilities. This could involve subtracting the P&L figures of the prior month from the current month, for example.
  • Step 5: Open the MDX code editor and replace the actual month names in the MDX code with the corresponding alias values you updated in Step 2. You can copy the code into Notepad and use "Find and Replace" to make this step faster and more efficient (an illustrative fragment follows).
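
To make Step 5 concrete, here is the kind of substitution it performs; the dimension and member names are hypothetical, and TM1 MDX resolves alias values such as "Curr Month" just as it does principal member names:

Before: [Month].[Month].[Jun-2023]
After:  [Month].[Month].[Curr Month]

Once the alias values move to new months, the same MDX automatically resolves to the new members.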


By replacing the month names with the alias values, you ensure that the calculation remains dynamic and adapts to the changing months without manual intervention. When you update the alias values in the Month dimension, it will reflect in the exploration view. As a result, the months displayed in the view will be dynamically updated based on the alias values. This ensures that your calculations remain synchronized with the changing months without the need for manual adjustments.


Important Note: When selecting the months in the set editor, it is crucial to explicitly select and move the individual months from the Available members pane (left pane) to the Current set pane (right pane). This ensures that unnecessary actions, such as expanding a quarter to select a specific month, are not recorded in the MDX generated by the exploration view, which could otherwise cause issues when replacing the member names with alias values.

This approach of using alias attributes to make calculations dynamic can be extended to various other calculations in Planning Analytics Workspace. It provides a flexible and user-friendly method to ensure that your calculations automatically adapt to changing dimensions or members.

That being said, it's important to note that there may be certain scenarios where alternative approaches, such as writing custom MDX code or utilizing a control cube, are necessary. Each situation is unique, and the chosen approach should align with the specific requirements and constraints of the calculation. Nevertheless, the proposed approach should still work for a wide variety of calculations in IBM Planning Analytics.


Unlocking the Power of Hierarchies in IBM Planning Analytics


With the introduction of hierarchies in IBM Planning Analytics, a new level of data analysis capability has been unlocked. This is by far one of the most significant enhancements to the Planning Analytics suite as far as the flexibility and usability of the application is concerned.


Benefits of LEAVES Hierarchy

One particular useful hierarchy is the LEAVES hierarchy. It offers several benefits beyond data analysis. 

One that stands out is that it is a “zero-maintenance” hierarchy as it automatically adds leaf level members as they are added in other hierarchies. It can also be used as a master hierarchy to validate and compare nLevel members in all the other hierarchies. Additionally, deleting the member from this hierarchy will delete it from the rest of the hierarchies.

All hierarchies must be created either manually or through a TI process. Contrary to the general perception within the PA community that the LEAVES hierarchy only gets added when you create a new hierarchy in a dimension, there is a quick and easy way to create the LEAVES hierarchy without creating any other hierarchy, in a few simple steps.

}DimensionProperties cube

This is where I would like to introduce you to a control cube: }DimensionProperties. In this cube you will find quite a few properties you can play around with. The two properties to focus on in this blog are "ALLLEAVESHIERARCHYNAME" and "VISIBILITY".

Creating LEAVES hierarchy

By default, the value for ALLLEAVESHIERARCHYNAME in the control cube is blank, however, entering any name in that cell against a corresponding dimension will automatically create a LEAVES hierarchy with that name. 


Once done, the Database Tree must be refreshed to see the leaves hierarchy reflecting under the dimension.

This way you can quite easily create the LEAVES hierarchy for any number of dimensions by updating the values in }DimensionProperties cube.
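
For illustration, the same update can be scripted in a TI process with a single CellPutS; the dimension name "Product" and hierarchy name "Leaves" here are hypothetical:

CellPutS('Leaves', '}DimensionProperties', 'Product', 'ALLLEAVESHIERARCHYNAME');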

Caution: If you overwrite the name in the control cube, the LEAVES hierarchy name is updated in the Database Tree, and any rules, processes, views, or subsets that reference the old LEAVES hierarchy name will no longer work. Once you restore the original name in the control cube, they will start working again. This risk can be mitigated by using a consistent naming convention across the model.

Note that the old hierarchy will still remain in the ‘}Dimensions’ dimension and changing the name does not automatically delete the old hierarchy member.


Toggling Hierarchies

In addition to creating the LEAVES hierarchy in a few simple steps, you can also use the }DimensionProperties cube to hide or unhide any hierarchy you have created. This capability is useful when many hierarchies have been created but only a select few need to be exposed to users. If a hierarchy is not yet finalized and is still in a WIP state, it can be hidden until the changes are complete. This gives administrators and power users more control over which hierarchies to show.

To hide any hierarchy, enter the value NO against the “Visibility” property in the control cube. Once the Database Tree is refreshed, that hierarchy will no longer be visible under the dimension. This property is also blank by default.
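
A corresponding TI sketch for the visibility toggle; the member "Product:Forecast" assumes the dimension:hierarchy naming convention, and both names are hypothetical, so verify the actual member names in your own }DimensionProperties cube:

CellPutS('NO', '}DimensionProperties', 'Product:Forecast', 'VISIBILITY');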


If a view contains a hierarchy and the VISIBILITY property of that hierarchy is set to NO, while the view still opens, opening the subset editor will throw an error.

Note, to unhide the hierarchy, delete the value or enter YES and refresh the Database Tree.

In conclusion, once you understand the benefits and account for the potential pitfalls of updating these properties, this capability can greatly enhance the overall usability and maintainability of the application.


DYNAMIZING DYNAMIC REPORTS: A Hack to Make Columns as Dynamic as Rows


If you’re tired of manually updating your reports every time you need to add a new column in your Dynamic Reports, you're not alone. It can be time-consuming and tedious - not to mention frustrating - to have to constantly tweak and adjust your reports as your data changes. Luckily, there’s a way to make your life easier: Dynamizing Dynamic Reports. By using a hack to make your reports’ columns as dynamic as the rows, you can free up time and energy for other tasks - and make sure your reports are always up-to-date. Read on to learn how to make your reports more dynamic and efficient!

The Good

Dynamic Reports in PAfE are highly popular, primarily due to their intrinsically dynamic nature. The great thing about this report type, and one of the big reasons for its wide adoption, is that the row content updates dynamically, driven either by the subset used or by the MDX expression declared within the TM1RptRow function. In addition, the formulas across the entire TM1RPTDATARNG are dictated by the master row (the first row of the data range) and cascade down automatically, as do the report's formats.

The Bad

That being said, with all those amazing capabilities, this report type has one big limitation: unlike rows, the columns are still static. The report builder must manually insert the elements and formulas across the columns, making the report "not so dynamic" in that respect.

Purpose of this blog

It is precisely this limitation that this blog aims to address, providing a workaround that makes the columns as dynamic as the rows, hence the title "Dynamizing the Dynamic Report".

Method

To achieve this dynamism, I have used a combination of four functions: three Excel 365 functions and one PAfE worksheet function. They are as follows:

  1. BYCOL - processes data in an array or range, applying a LAMBDA function to each column in the array and returning one result per column as a single array

  2. LAMBDA - a UDF (user-defined function) capability for creating generic custom functions in Excel that can be reused, either embedded as an argument in a LAMBDA-supporting function (such as BYCOL) or as a standalone function when saved as a named range

  3. TRANSPOSE - a Dynamic Array function that transposes a row or column array

  4. TM1ELLIST - the only PAfE worksheet function of the four; returns an array of values from a dimension subset, static list, or MDX expression

Instructions

Let's have a look now at how we have utilized these functions within the Dynamic Report.

The example report is a Dynamic Report showing data from the Benefits Assumptions cube, which has three dimensions: Year, Version, and Benefit.

The Benefit dimension is across rows, Year across columns, and Version on the title.

In cell C17, I used the TM1ELLIST function to get the Year members (Y1, Y2, Y3) from a subset named "Custom Years", returning them as a range, and then wrapped it inside the TRANSPOSE function to transpose the resultant range.

Cell C17 formula:
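
The original screenshot of the formula is not reproduced here; based on the description it would take roughly this shape (the connection name is hypothetical and the TM1ELLIST argument order is abbreviated, so check the PAfE documentation for the full signature):

=TRANSPOSE(TM1ELLIST("Conn:Model", "Year", "Custom Years"))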

In cell C18, instead of DBRW, I used the BYCOL function, passing the range from cell C17 with the spilled-range reference (C17#) as its first argument.

I then used the LAMBDA function to create a custom function as its second argument, declaring a variable x and passing it into the DBRW formula in the position of the Year dimension.

The formula works like this: it takes the output of the TM1ELLIST function and passes each member of it into the LAMBDA function as variable x, which is in turn passed into the DBRW formula. The result is a dynamic range that automatically resizes based on the output of the TM1ELLIST function.

Cell C18 formula: 
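
Again reconstructing from the description (the connection, cube, and cell references are illustrative and depend on your report layout; the row spill comes from passing the Benefit members to DBRW as an array reference):

=BYCOL(C17#, LAMBDA(x, DBRW("Conn:Benefits Assumptions", $B18#, $C$15, x)))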


Note that the formula is only entered in one cell (C18) and it spills across both rows and columns.

Caveats

  1. This is only supported in PAfE which means it won’t work in PAW or TM1Web

  2. Works only in Excel versions that support Dynamic Array and LAMBDA functions

  3. The formatting is not spilled

 


Exploring the Latest Enhancements of IBM Planning Analytics Components


As the world moves towards more data-driven decision-making, businesses are increasingly looking for effective planning and budgeting solutions. IBM Planning Analytics is the go-to for businesses looking for a comprehensive set of tools to help them manage their budgeting and planning process.


With Planning Analytics, businesses can access powerful analytics to make more informed decisions, leverage advanced features to create complex models, and gain better insights into their financial data.

IBM is constantly improving the functionalities and features of the IBM Planning Analytics components. This includes Planning Analytics Workspace (PAW), Planning Analytics for Excel (PAfX), and Planning Analytics with Watson. With these updates, businesses can take advantage of new features to help them manage their budgeting and planning process more effectively.

In the last 12 months, IBM has released several updates to its Planning Analytics components.

In PAW, users can now access advanced analytics such as forecast simulations, predictive models, and scenario analysis. They can also perform in-depth analysis on their data with the new Visual Explorer feature. In addition, users can now access a library of planning and budgeting models, which can be customized to fit the needs of their organization. (download PDF file to get the full details)


In PAfX, users can now access advanced features such as SmartViews and SmartCharts. SmartViews allows users to visualize their data in various ways, while SmartCharts allows users to create interactive charts and graphs. Users can also take advantage of the new custom formatting options to make their reports look more professional.


Finally, with Planning Analytics with Watson, users can access powerful AI-driven insights. This includes AI-driven forecasting, which allows users to create more accurate forecasts. In addition, Watson can provide insights into the drivers of their business, allowing users to make more informed decisions.

 


 

Overall, IBM’s updates to the Planning Analytics components provide businesses with powerful tools to help them manage their budgeting and planning process. With these updates, businesses can take advantage of the latest features to quickly access data-driven insights, create more accurate forecasts, and gain better insights into their financial data.

Download the PDF file below to get the full details for each IBM Planning Analytics component.


Top 12 Planning Analytics features that you should be using in 2023


Amin Mohammad, the IBM Planning Analytics Practice Lead at Octane Solutions, takes you through his top 12 capabilities of Planning Analytics in 2023. These are his personal favorites; there are, of course, more capabilities than he covers here.


He has decided to divide his list into PAfE and PAW, as each has its own unique capabilities, and to highlight them separately.

Planning Analytics for Excel (PAfE)

1. Support for alternate hierarchies in TM1 Web and PAfE

Starting with the TM1 Set function, which has finally opened up the option to use alternate hierarchies in TM1 Web. It contains nine arguments, as opposed to the four in SubNM, adding to its flexibility, and it supports MDX expressions as one of those arguments. This function can serve as a good replacement for SubNM.

2. Updated look for cube viewer and set editor

Planning Analytics Workspace and Cognos Analytics have taken the extra step to provide a consistent user experience. This includes the incorporation of the Carbon Design Principles, which have been implemented in the Set Editor and cube viewer in PAfE. Users now enjoy an enhanced look and feel of these components, as well as improved capabilities. This is an excellent addition that gets the most out of the user experience.

3. Creating User Defined Calculations (UDC)

Hands down, User Defined Calculations is the most impressive capability added recently. It allows you to create custom calculations using the Define Calc function in PAfE, which also works in TM1 Web. With it, you can easily perform various calculations, such as consolidating data based on a few selected elements or performing arithmetic on your data. Before this capability, we had to create custom consolidation elements in the dimension itself to achieve these results in PAfE, leading to multiple consolidated elements within the dimension and making it very convoluted. The only downside is that it can be a bit technical for some users, which may be a barrier to mass adoption. Additionally, the sCalcMun argument within this function is case-sensitive, so bear that in mind. Hopefully this is fixed in future releases.

4. Version Control utility

The Version Control utility helps validate whether the version of Pathway you are using is compatible with the version of your Planning Analytics data source. If the two versions are not compatible, you cannot use Pathway until you update the software. Version Control uses three compatibility statuses:

  • normal
  • warning
  • blocked

Administrators can also configure the Version Control to download a specific version of Pathway when the update button is clicked, helping to ensure the right version of Pathway is used across your organization.

Planning Analytics Workspace (PAW)

5. Single Cell widget

Planning Analytics Workspace has recently added the Single Cell widget as a visualization, making it easier to update dimension filters. Before this, the Single Cell widget could be added by right-clicking a particular data point, but it had its limitations. 

One limitation that has been addressed is the inability to update dimension filters in the canvas once the widget has been added. Previously, updating a filter meant redoing all the steps; the Single Cell visualization changes this. Now, users can change the filters and the widget will update the data accordingly. This has been a great improvement as far as user experience goes.

Additionally, the widget can be transformed into any other visualization and vice versa. When adding the widget, the data point selected at that moment is reflected in it. If nothing is selected, the top-left (first) data point in the view is used to create the widget.

 

6. Sending email notifications to Contributors

You can now easily send email notifications to contributors with the click of a button from the Contribution Panel of the Overview Report. When you click the button, it sends an email to the members of the group assigned to the task. The email option is only activated when the status is either pending approval or pending submission. Clicking the icon sends the email to all members assigned to the group for that task.

7. Add task dependencies

Now, you can add task dependencies to plans, which allows you to control the order in which tasks can be completed. For example, if there are two tasks and Task Two is dependent on Task One, Task Two cannot be opened until Task One is completed. This feature forces users to do the right thing by opening the relevant task and prevents other tasks from being opened until the prerequisite task is completed. This way, users are forced to follow the workflow and proceed in the right order.

8. Approval and Rejections in Plans with email notifications

The email notifications mentioned here are not manually triggered like the ones in pick 6. These emails are fully automated and event-based. The triggering events include opening a plan step, submitting a step, or approving or rejecting a step. The emails sent out contain a link taking the user directly to the plan step in question, making the planning process easier for users to follow.


"The worklow capabilities of the Planning Analytics Workspace have seen immense improvements over time. It initially served as a framework to establish workflows, however, now it has become a fully matured workflow component with many added capabilities. This allows for a more robust and comprehensive environment for users, making it easier to complete tasks."

9. URL to access the PAW folder

PAW (Planning Analytics Workspace) now offers the capability to share links to a folder within the workspace. This applies to all folders, including the Personal, Favorites, and Recent tabs. This is great because it makes it easier for users to share information, and also makes the navigation process simpler. All around, this is a good addition and definitely makes life easier for the users.

10. Email books or views

The administrator can now configure the system to send emails containing books or views from Planning Analytics Workspace. Previously, the only way to share books or views was to export them into certain formats. However, by enabling the email functionality, users are now able to send books or views through email. Once configured, an 'email' tab will become available when viewing a book, allowing users to quickly and easily share their content. This option was not previously available.

11. Upload files to PA database

Workspace now allows you to upload files to the Planning Analytics database. This can be done manually using the File Manager, found in the Workbench, or through a TI process. IBM has added a new property to the action button that uploads the file when the TI process runs. Once the file is uploaded, it can be used in the TI process to load data into TM1. This way, users do not have to save the file to a shared location; they can simply upload it from their local desktop and load the data. This is a handy new piece of functionality from IBM. Bear in mind that the process cannot run until the file has been successfully uploaded, so a large file may take time.

12. Custom themes

Finally, improvements in custom themes. The ability to create your own custom themes is incredibly helpful for aligning the coloring of your reports with your corporate design. This removes the limitation of only being able to use pre-built colors and themes, and instead allows you to customize to your specific requirements, so the environment feels like your own when any user opens it.

That's all I have for now. I hope you found these capabilities insightful and worth exploring further.

If you want to see the full details of this blog post, click here.


IBM TM1 Cognos user experience white paper 2022


This white paper was written to describe the level of self-service users can expect from TM1. Download the full white paper in PDF below.


To understand what users can expect from TM1, imagine your organisation’s finance office is a house. In this house, users are busy building different kinds of rooms. Popular rooms include budgeting and forecasting. In an adjacent room, analysis of financials is taking place. Down the hallway a report is being created.

In addition to the rooms, there is furniture and of course, a foundation. The distinction between the rooms and the foundations is important. Without a strong foundation, the rooms will collapse upon themselves. Adding new rooms would be made difficult, as the added weight may not be supported by the structure.

While users build and carry out activities in the rooms, it is the foundations where the engineers or better still, TM1 developers, work. TM1 developers are busy ensuring there is no unnecessary duplication in the foundations. We call this governance. Developers also ensure the foundations are built with integrity so TM1 is efficient and runs lean.

To keep your instance of TM1 running, you will need both users and developers.


Data Governance IBM Cognos


Balancing Flexibility and Standards - IBM Cognos TM1 Govern Data Discovery

The world of data has infinite variety…and infinite requirements. No single analytics solution will please everyone. But it is possible to combine the strengths of complementary solutions to help meet the needs of both enterprise IT personnel who crave data governance and the end business users who are hungry for self-service. IBM offers different flavors of analytics to bring you the best side-by-side experience.

In today’s blog, I would like to focus on IBM Cognos TM1 Planning Analytics - a unique union of data governance reporting and dashboards coupled with smart data discovery that just about anyone can use. With self-service, cognitive insight, visualisation, data governance, reporting and sharing, this blended analytics solution is bigger than the sum of its parts.

 

IBM Cognos TM1 Planning Analytics

IBM has completely redesigned its IBM Cognos BI products to focus on business users and strike a better balance between governance and discovery. The latest solution is an all-Web product that runs either in the cloud or on-premises and provides a more intuitive, guided user interface that makes it easy for business users to interact with and create content.

One of the striking features of IBM Cognos TM1 Planning Analytics is its personalisation and guided analytics tools. The product provides a single interface that gracefully exposes functionality as users evolve from content consumption to creation. Authorised users can open a report, dashboard or model, make changes and store the result for personal use or share it with others.

When using TM1 for reporting/dashboarding, customers can be confident that data is validated and can be audited back to the source. Even in a distributed enterprise model, TM1 is usually architected to be the single source of truth.


 

Intent-Driven Authoring

Consumers can not only interact with predefined reports and dashboards, but also create new content using an "intent-driven" authoring environment, which uses a search interface to automatically generate visualisations based on keywords entered by users. Unlike other self-service BI tools, users don't have to drag and drop metrics and dimensions onto chart axes to configure a visualisation; IBM Cognos TM1 Planning Analytics does this automatically from the text users type into a search bar.

 

Intent-Driven Modeling 

Besides auto-generating visualisations, IBM Cognos Analytics uses search to auto-generate modeled data sets that blend data from multiple sources. Called intent-driven modeling, the new feature enables IBM Cognos Analytics users to type in keywords to generate a data model for a new data set. Intent-driven modeling takes ease of use and self service to an extreme, enabling casual users to create new data sets from which they can build custom visualisations using intent-driven authoring.

Another noteworthy feature of IBM Cognos TM1 Planning Analytics is on-demand toolbars and menus in report authoring mode, which pop up as users click on data, exposing available functions, such as filter, sort, calculator and so on. Rather than overwhelm users with features and functions, IBM uses toolbars to put BI functionality at users’ fingertips simplifying the BI experience and making it easier to navigate and interact with data.

 

Conclusion

IBM has been a mainstay of the BI market since it acquired Cognos in 2008. With IBM Cognos TM1 Planning Analytics, IBM is determined to lead the market in ease of use and guided analytics. New self-service features make it easier for business professionals to create data sets and reports for personal and shared consumption.

 


Octane Software Solutions Partners with QUBEdocs to Deliver Cutting-Edge Solutions


Octane Software Solutions is a cutting-edge technology and services provider to the Office of Finance. Octane partners with vendors like IBM and BlackLine to provide AI-based solutions that help finance teams automate their processes and increase their ability to deliver business value to the enterprise.

Qubedocs is an automated IBM Planning Analytics documenter. It generates automated documentation within minutes and ensures compliance and knowledge management within your organisation. So, we're excited to announce our partnership with QUBEdocs - a solution that takes the resources and headaches out of TM1 modelling. In this article, we discuss common challenges with Planning Analytics and how QUBEdocs transforms this process.

Challenges with Planning Analytics (TM1)

Our experience in the industry has meant we've worked with many enterprises that encounter challenges with Planning Analytics. Common concerns and challenges that our clients face are listed here:

  • Correct documentation
  • Over-reliance on developers, which leaves businesses vulnerable.
  • Unable to visualise the full model, resulting in not understanding the information and misinterpreting the model.
  • Are business rules working correctly?
  • Understanding data cubes
  • Disaster recovery and causation analysis
  • Managing audit
  • Compliance with IBM licence rules

Reading through these challenges paints the picture of a complicated process to manage and support. They cover a broad range of concerns: ensuring the documentation is correct, understanding the data and information, and knowing whether it is all working as intended. Automating this process can take the guesswork and lack of confidence out of the models.

How QUBEdocs transforms the process

We've partnered with QUBEdocs because of its capability to transform TM1 models. Through QUBEdocs you can generate custom documentation in minutes (as opposed to months) for your IBM Planning Analytics TM1 models. You're able to meet your regulatory requirements, capture company-wide knowledge, and gain an accurate, up-to-date view of TM1 model dependencies.

Below is a list of benefits that QUBEdocs offers:

Purpose-built

Specifically built for business intelligence, QUBEdocs allows seamless integration with IBM Planning Analytics.

Fully automated documentation

QUBEdocs focuses on driving business value while documenting every single detail. Automating the documentation takes the errors out of the process and ensures your plans are knowledge-driven.

Personalised reporting

QUBEdocs keeps track of all the layers of data that are important to you – choose from standard reporting templates or customise what you want to see.

Compare models

Compare different versions of your model to gain complete visibility and pinpoint changes and potential vulnerabilities.

Cloud-based

QUBEdocs' up-to-date features and functionality require no infrastructure to use and allow collaborative, remote working.

Data with context

Context is critical to data-driven decisions. Every result in QUBEdocs is supported by context, so you understand before you act.

Model analysis 

Models offer a way to look at your applications, objects or relationships in-depth. Analysing your models can help you understand your complex models intuitively, so you know each part of your business and what it needs to succeed.

Dashboards 

Understand your server environment at a glance with key metrics tailored for different stakeholders in your business.

Summary

This article has outlined the benefits of QUBEdocs and why we're excited to announce our partnership. When you work with Octane Software Solutions, you get a company that's in it for the long haul, supporting you until you've grown into your new wings. If QUBEdocs is right for you, a big part of our process is implementing it into your organisation so that it's fully enabled to improve your business performance.

Learn more about QUBEdocs or join our upcoming webinar: How to automate your Planning Analytics (TM1) documentation.


Planning Analytics with Watson (TM1) Training made easy


We have made it easier for your users to access Planning Analytics with Watson (TM1) Training.

 

This week we launched our online training for Planning Analytics with Watson (PAW and PAX), available online as instructor-led, 4-hour sessions.

 

Planning Analytics with Watson (TM1) Training

This training is an ideal way to spend some of your allocated training budget, often assigned but never utilised, on something you can actually apply in your workplace. We have made it easy for you to book your training online in a few easy steps.

IBM has been consistently improving and adding new features to PAW and PAX. To maximise your training outcome, we will run the training on the latest (or very close to the latest) release of PAW and PAX, which will give you a good insight into the new features available. Our training will speed up your understanding of those features and help inform your upgrade decisions. The best part of our training offering is that we have priced it at only $99 AUD, which is great value.

Being interactive instructor-led TM1 training, you would be able to ask questions and get clarifications in real-time. Attending this training will ensure that you and your staff are up-to-date with the latest versions and functionalities.

 

Training outcomes

Having your users trained up means you can utilise your Planning Analytics with Watson (TM1) application to its full potential. Users will be able to self-serve their analytics and reporting. They will also log fewer tickets as they understand how to use the system. Engagement will go up as they actively participate in providing feedback on your model's evolution. Overall, you should expect to see an increase in productivity from your users.

 

PAW Training Overview
  • Introduction of PA and workspace
  • Welcome page
  • Creating books
  • Creating views
  • Hiding rows and columns in views
  • Snap commands
  • Selector widget
  • Synchronising objects in a book or sheet
  • Adding navigation button to sheet
  • Dataset export
  • Visualisations
  • Creating metric visualisations
  • Add text box
  • Work with images
  • End-user calculations
  • Using MDX based subsets
PAX Training Overview
  • Introduction to PAX
  • Overview and list components
  • Setup IBM connection, connecting data source, open workbook
  • Working with data and reports
  • Clear cell content
  • Convert dynamic data to snapshots
  • Exploration views
  • Lists
  • Quick report
  • Dynamic report
  • Custom report
  • Publish workbooks
  • Sets for TM1
  • IBM TM1 functions
  • Cube viewer
  • Action buttons

 

Training delivery

The training course will be delivered online by Octane senior consultants with 10-15 years of delivery experience. The class size is limited to 12 attendees to ensure everyone gets enough attention.

The training sessions are scheduled across multiple time slots, so you should be able to find one that suits you.

 

Have you got any questions?

We have captured most of the questions we've been asked on this FAQ page.

I look forward to seeing you at training. 


What's in a name? Watson in the name!


Starting 1 April 2021, "with Watson" will be added to the name of the IBM Planning Analytics solution.

IBM® Planning Analytics with Watson will be the official product name represented on the IBM website, in the product login and documentation, as well as in marketing collateral. However, the IBM TM1® text will be maintained in descriptions of Planning Analytics' capabilities, differentiators, and benefits.

 

What is the "Watson" in Planning Analytics with Watson?

The cognitive help feature within Planning Analytics with Watson is the help system used in IBM Planning Analytics Workspace (Cloud). This feature uses machine learning and natural language processing to drive clients towards better content that is more tailored to the user's needs. As clients interact with the help system, the system creates a content profile of the content they are viewing and what they are searching for.

 

Branding benefits of the name

  • Utilize the IBM Watson® brand, a leader in the technology and enterprise space, to gain a competitive advantage
  • Position AI and predictive capabilities as differentiators in how we approach planning
  • Amplify the reach of planning analytics to our target audience and analysts through Watson marketing activities

 

What do we think?

We are pleased to note that the name TM1 remains with the product. The Planning Analytics product has evolved significantly from the early days of Applix. We had initial apprehension when IBM acquired TM1 via the Cognos acquisition (IBM acquired Cognos in January 2008 for USD $4.9 billion). We naturally assumed that this little gem of a product would be lost in the vast portfolio of IBM software.

However, it's quite pleasing to see TM1 thrive under IBM. It received significant R&D funding, which turned TM1 into an enterprise planning tool. We saw the development of Workspace, which brought in modern dashboard and reporting features. The move to PAx gave us an even better Excel interface and, just lately, the Workspace feature that manages complex enterprise workflows.

The biggest gamechanger was making Planning Analytics available as Software as a Service (you can still get it as an on-premise solution). This reduced the time to deploy to a couple of days. There is no cost to the business in maintaining the application or applying patches and upgrades. Gone are the days of IT and Finance at loggerheads over the application. The stability and speed of Planning Analytics as a SaaS product has pleasantly surprised even us believers!

Adding Watson to the name is timely, as AI-infused features around predictive forecasting are becoming more prevalent. There is no doubt that IBM Planning Analytics with Watson is the most powerful AI-based planning tool available. It's time to acknowledge the future of where we are going.

What do you think of the name change? Share with us your thoughts.

 


Planning Analytics Audit log – Little known pitfall


This blog briefly covers a challenge we faced after enabling the audit log in one of our clients' environments. Once the audit log was turned on to capture metadata changes, the scheduled Data Directory backup process started to fail.

After some investigation, I found the cause was the temp file (i.e., tm1rawstore.<TimeStamp> ) generated by the audit log by default and placed in the data directory.

The temp file is used by the audit log to record events before moving them to a permanent file (i.e., tm1auditstore<TimeStamp>). Sometimes you may also notice dimension-related files (i.e., DimensionName.dim.<Timestamp>); these are generated by the audit log to capture dimension changes.

RawStoreDirectory is the tm1s.cfg parameter that helped us resolve the issue. It defines the folder path for the audit log's temporary, unprocessed log files (i.e., tm1rawstore.<TimeStamp> and DimensionName.dim.<Timestamp>). If this parameter is not set, these files are placed in the Data Directory by default.

RawStoreDirectory = <Folderpath>

 

Now, let's also look at the other config parameters related to audit logs.

 

AuditLogMaxFileSize:

This parameter controls the maximum size an audit log file can reach before it is saved and a new file is created. The unit must be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in tm1s.cfg, the default value is 100 MB.

AuditLogMaxFileSize=100 MB

 

AuditLogMaxQueryMemory:

This parameter controls the maximum memory the TM1 server can use when running an audit log query and retrieving the result set. The unit must be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in the tm1s.cfg, the default value is 100 MB.

AuditLogMaxQueryMemory=200 MB


AuditLogUpdateInterval:

This parameter controls the amount of time the TM1 server waits before moving the contents of the temporary files to a final audit log file. The value is in minutes; for example, a value of 100 means 100 minutes.

AuditLogUpdateInterval=100
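
Putting these together, a minimal sketch of the audit-log section of a tm1s.cfg might look like the following (AuditLogOn is the parameter that enables audit logging in the first place; the folder path and values are illustrative):

AuditLogOn=T
RawStoreDirectory=D:\TM1\AuditTemp
AuditLogMaxFileSize=100 MB
AuditLogMaxQueryMemory=200 MB
AuditLogUpdateInterval=60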

 

That's it, folks; we hope you learnt something new from this blog.

Line

Data Analysis using Dynamic Array formulas

Mode_Comment_Icon_black0
Alarm_Icon_16 min

How to create reports using dynamic array formulas in Planning Analytics TM1

 

In our previous blog (https://blog.octanesolutions.com.au/what-are-dynamic-array-formulas-and-why-you-should-use-them), we discussed Dynamic Array formulas and highlighted the key reasons and advantages of using DA formulas.

In this blog, we will create a few intuitive reports based on custom reports built in PAfE. The data set we will be using shows employee details from an "Employee" cube with the following dimensionality:

 

Dimensions of the Employee cube:    Year, Version, Sr.No, Organisation, Measure
Elements of the Measure dimension:  Department, Name/Desc, Current Salary, Joining Date

 

 

Below is the screenshot of my PA data that I will be using for this blog:

 

Integration-Data

 

 

For ease of formula entry, I’ve created named ranges for columns B to F.

 

Integration-Data

 

Now that we’ve set the base, lets start off with generating some useful insights with our dataset.

  1. Get the employees with the top/bottom 3 salaries
  2. Sum data based on a date range
  3. Create a searchable drop-down list

 

Integration-Data

 

 

Formula in cell J22 is as below:

 

Integration-Data
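
As the formula itself lives in the screenshot, here is a minimal sketch of the kind of formula described, assuming employee details sit in B3:F20 with the salaries in column F (these ranges are illustrative, not the actual named ranges from the workbook):

=FILTER(B3:F20, F3:F20 >= LARGE(F3:F20, 3))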

 

I will try to break down the formula and explain it in simple language:

We used the FILTER function, which is a DA formula. The Excel FILTER function filters a range of data based on supplied criteria and extracts the matching records. It works in a similar way to VLOOKUP, except that VLOOKUP returns a single value, whereas FILTER returns one or more values that match the criteria. FILTER takes three arguments: array, include and if_empty. We passed the employee and salary list as the array in our formula, and for the include argument we used the LARGE function (which returns the x-th largest value in an array, where x is a number) and compared it with all the salaries using the greater-than-or-equal-to operator.

With this criterion, the array is filtered to those employees whose salary is greater than or equal to the 3rd largest salary.

Similarly, if you wish to filter the employees by the 3 lowest salaries, use the below formula to achieve the same:

 

Integration-Data
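
A sketch of the bottom-3 variant, under the same assumed ranges, simply swaps LARGE for SMALL and flips the comparison:

=FILTER(B3:F20, F3:F20 <= SMALL(F3:F20, 3))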

 

A very common analysis based on a date range is summarising, or calculating the average of, data between a start and end date. So let's see how we can achieve this using DA formulas. The scenario: the analyst wants to see the sum of the salaries paid for all the periods between Jan 2019 and Dec 2019.

Let's first get the list using the FILTER function; once we have the data, it is very easy to summarise it.

 

 

Integration-Data

 

Formula in cell H22 is as below:

 

Integration-Data
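
Again assuming illustrative ranges (joining dates stored as text in E3:E20, the From and To dates in H19 and H20), a sketch of this kind of formula is:

=FILTER(B3:F20, (NUMBERVALUE(E3:E20) >= H19) * (NUMBERVALUE(E3:E20) <= H20))

Multiplying the two Boolean arrays is what gives the AND behaviour here, since the AND function itself collapses arrays to a single value. Note that NUMBERVALUE only works for this comparison if the text dates hold numeric date serials.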

 

The concept is similar to the previous one, where we're getting a list of employees with their salaries and joining dates based on a set condition. Here we're using an AND condition to filter the data on two date boundaries: the joining date of the employee must be greater than or equal to the From date and less than or equal to the To date. We had to use the NUMBERVALUE function to convert the date, which is stored as string data in Planning Analytics, to a numeric value for the logical comparison.

Now that we know the condition works, we can apply the same condition within a FILTER function that returns only the Salary column, and wrap it inside the SUM function to summarise the salaries.

 

Integration-Data

 

Formula in cell L19:

 

Integration-Data
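
Under the same assumptions, a sketch of the summarised version returns only the salary column and wraps it in SUM:

=SUM(FILTER(F3:F20, (NUMBERVALUE(E3:E20) >= H19) * (NUMBERVALUE(E3:E20) <= H20)))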

 

In PAfE, a SUBNM is used to search and select the elements of a dimension. However, there is currently no provision to filter the list of elements in a SUBNM list to show only the elements that match a text string, let alone a wildcard search. One of the cool things we can do with DA formulas is create a searchable drop-down list.

Let's create a searchable drop-down list for the Department now and see how it works.

 

Integration-Data

 

In the screenshot above, I've entered the letter i in cell H7, which is a Data Validation list in Excel, and the drop-down lists all the departments which have the letter i in them. The actual formula is written in cell I1, and that cell is referenced in the Source field of the Data Validation.

 

Integration-Data

 

I’ve used a Hash(#) character in the source to refer to an entire spill range that the formula in I1 returns.

Formula in cell I1:

 

Integration-Data
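
A sketch of this formula, assuming the department column has been given the named range Department and the search text is typed into H7:

=UNIQUE(FILTER(Department, ISNUMBER(SEARCH(H7, Department))))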

 

I’ve wrapped a Filter function within a UNIQUE function that is another DA function that returns a unique list of values within an array. The Filter function uses SEARCH function to return a value if a match is found which is then wrapped inside ISNUMBER to return a Boolean value.

Note: While the example uses a custom report, the same named ranges can very well be created in a Dynamic report using the OFFSET function, so this analysis is not restricted to sliced reports but also applies to Dynamic (aka Active Form) reports.

 

These are just a few of the super easy, on-the-fly analyses we can do with DA functions, and they can take the reporting capabilities of PAfE to a whole new level.

 

Line

Dynamic Array formulas in IBM PA TM1 - Supercharge your Excel report

Mode_Comment_Icon_black1
Alarm_Icon_14 min

Dynamic Array Formulas in IBM PA TM1 (1)

 

What are Dynamic Array formulas and why you should use them?

 

In this blog article (and a few other upcoming blogs), I am going to write about the capabilities of Dynamic Array (DA) functions, with examples, and demonstrate some great features that I believe can empower PA analysts to do all sorts of data analysis in a simpler and much more intuitive way, thereby enhancing their productivity.

To start off, let's first understand what Dynamic Array functions actually are.

To put it simply, DA functions are functions that leverage Excel's latest DA calculation behavior, where you no longer have to press CSE (Ctrl+Shift+Enter) to enter array formulas, or copy and paste the formula for each value you want returned to the grid.

With DA, you simply enter the formula in one cell and hit Enter, and an array of values is returned to the grid, also known as spilling.

The DA functions are currently only supported in Office 365 but, according to Microsoft, support will be extended to other versions soon.

Typically, when you enter a formula that may return an array in an older version of Excel and then open the workbook in a DA version of Excel, you will see an @ sign, also known as the implicit intersection operator, before the formula. Excel adds this automatically to all formulas that it considers might potentially return multi-cell ranges. With this sign, Excel ensures that formulas which could return multiple values always return just one value in a DA-compatible version and do not spill.

Following is the information on implicit intersection available on the Microsoft website:

With the advent of dynamic arrays, Excel is no longer limited to returning single values from formulas, so invisible implicit intersection is no longer needed. Where an old Excel formula could invisibly trigger implicit intersection, dynamic array enabled Excel shows where it would have occurred. With the initial release of dynamic arrays, Excel indicated where this occurred by using the SINGLE function. However, based on user feedback, we’ve moved to a more succinct notation: the @ operator.

Note: According to Microsoft, this shouldn't impact existing formulas; however, a few Planning Analytics clients have already reported issues with @ in DBRW formulas in PAfE, where the formula no longer works. The @ sign had to be manually removed from all DBRW formulas to make them work. This is a bit of a bummer because, depending on the number of reports, it may involve a significant amount of work; a VBA macro might be of relief here, otherwise it is a rather tedious task.
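
As a starting point, a minimal VBA sketch of such a clean-up (an illustration, not a tested utility; adjust to your workbooks before relying on it) could look like this:

Sub RemoveAtFromDBRW()
    ' Sketch: strip the implicit-intersection @ sign from DBRW formulas
    ' in every worksheet of the active workbook.
    Dim ws As Worksheet, c As Range, rng As Range
    For Each ws In ActiveWorkbook.Worksheets
        Set rng = Nothing
        On Error Resume Next            ' a sheet may contain no formulas at all
        Set rng = ws.UsedRange.SpecialCells(xlCellTypeFormulas)
        On Error GoTo 0
        If Not rng Is Nothing Then
            For Each c In rng
                If InStr(c.Formula, "@DBRW") > 0 Then
                    c.Formula = Replace(c.Formula, "@DBRW", "DBRW")
                End If
            Next c
        End If
    Next ws
End Sub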

More on implicit intersection can be found at the link below:

https://support.microsoft.com/en-us/office/implicit-intersection-operator-ce3be07b-0101-4450-a24e-c1c999be2b34?ui=en-us&rs=en-us&ad=us

Additionally, there is another key update that must be made in the Excel settings to address a bizarre side effect of implicit intersection observed in Dynamic reports. See the link below for details:

https://www.ibm.com/support/pages/unable-expandcollapse-tm1rptrow-dynamic-report-shows-mail-icon

Below is the list of DA formulas currently available in Excel 365.

FILTER

RANDARRAY

SEQUENCE

SORT

SORTBY

UNIQUE

I will be covering off a bit more in detail on these functions in my subsequent blogs to showcase real power of these functions so hang in there till then.

As for why you should use them, below are some of the reasons I've listed:

1. It complements the PAfE capabilities and fills the gaps that PAfE could not due to its limitations

2. It can open up a world of new data analysis capabilities

3. Once you understand the formulas and the Boolean concept (which is not complicated by any means), its true potential can be realised

4. It is simple yet very powerful and a big time-saver

5. With formula sitting only in one cell, it is less error prone

6. The calculation performance is super-fast

7. No more CSE!

8. It is backward compatible, meaning you need not worry about how your DA results appear in legacy Excel as long as you're not using the DA functions

9. Updates are easy to make: you only need to update one cell as opposed to all cells

This is my value proposition for why you should use DA formulas. I've not yet demonstrated what I proposed, which I intend to do in my later blogs; till then, thanks for reading, folks, and stay safe.


 

Line

Octane Celebrates 4th Anniversary

Mode_Comment_Icon_black0
Alarm_Icon_16 min

2020 has been an interesting year for us!

 

4th anniversary

 

Image 3

Celebrations

This month Octane celebrates our 4th anniversary. We never imagined that we would be celebrating with our team members across different geographies via online gifts and Teams meetings. Normally we fly all our team members to one location for the weekend and have a great time bonding. However, with the global pandemic, we had to adapt as the rest of the world has.

The journey so far

On the whole, reflecting on the journey so far on our anniversary, 2020 has certainly thrown in a riveting challenge. Having started from a small shared office in the north of Sydney, Octane today has 7 offices and operates in multiple countries. We have been helping some of the largest and most diverse enterprises around the world get greater value out of their Planning Analytics applications. Travel to client sites in different cities has always been my favourite job perk. As we were getting to grips with the pandemic in Feb/March, we were on a trip to Dubai and Mumbai meeting clients and staff. There was a bit of concern in the air, but none of us had any idea that travel would come to a standstill. We suddenly found ourselves coordinating with our staff, arranging their safe travel back to their homes, navigating the multiple quarantine regimes of different countries and fighting for seats on limited flights.

 

 

Octane 4th annivesary Blog-1
Octane Team at client site in Dubai

 

Blog
Dubai Mall - One of our clients

 

20190908_111932-1
One of the team outings in Delhi

We had a team of consultants assisting Fiji Airways with their finance transformation. The travel restrictions and a volatile economy meant that Fiji Airways had to swiftly change gears and make use of our team to assist in business scenario planning and modelling. This was an exemplary case of how an organisation reacted quickly and adopted a distinct business model to face the challenges. Thankfully, their platform supported scenario modelling and could handle what-if analysis at scale with ease (the same cannot be said for some other enterprises that had to resort to Excel or burn the midnight oil to provide the right insights to the business; this is why we love Planning Analytics TM1!)

 

Blog-3
Fiji Airways (our client) on Tarmac at Nadi Airport

 

4th anniversary
Sameer Syed at the Welcoming ceremony of Fiji Airways A350 aircraft

The Silver Lining for Octane

With the pandemic came a rapid rethink of the business model for most organisations. We at Octane were already geared up to provide remote development and support of TM1 applications. With our offshore centres and built-in economies of scale, we were in a position to reduce the overall cost of providing development and support. This gained a lot more traction as organisations started evaluating their costs and realised we were able to provide a better quality of service at a lower cost without a hitch. We internally tweaked our model to reduce the barriers to entry for companies wishing to take up our Managed Service options. We already had a 24/7 support structure in place, which meant that we could provide uninterrupted service to any client anywhere in the world in their time zone.

Within Octane we were also operating in crisis mode, with daily management calls. Ensuring the safety and well-being of staff was our first priority as different countries and cities brought in lockdowns. We remained agile and forged tactical plans with clients to ensure there was minimal disruption to their business. Working from home was the new normal. We already had all the technology to support this and specialise in remote support, so this was a fairly easy exercise for us. From the lows in May, slowly but steadily, our business model started to gain traction as we focused on client service and not profits.

Growth Plans and Announcements

In the chaos of 2020, it was also important to us to continue with our growth plans. We had to tweak our strategy and put opening new offices in some countries on hold. Travel restrictions and clients' moves to a new business model meant we did not need to be present in their offices.

One major announcement is that Octane has signed up as a business partner of BlackLine. BlackLine is a fast-growing financial close and finance automation system. It fits in well with our current offering to the office of finance and operations.

The other significant milestone was the launch of DataFusion. This is a connector developed in-house to connect Planning Analytics TM1 to Power BI, Tableau or Qlik seamlessly. These are some of the most common reporting tools, and they typically require manual data uploads, which leads to reconciliation issues and untimely reporting data. DataFusion has resonated very well with the TM1 community.

We also have a number of vendors discussing partnership opportunities with us, and we will make these announcements as they are finalised. This is largely a realisation that, in the current climate, our onshore/offshore hybrid business model provides the best cost-benefit equation for clients.

Octane Community

We at Octane have always been part of the community and have been hosting user groups in all the cities we operate in. With the onset of Covid, we have stepped up our efforts and host a monthly user group meetup. Our meetups are generally focused on providing tips, tricks and "how to" sessions for the existing Planning Analytics user base. Registrations for the user groups have been increasing steadily.

As part of our corporate social responsibility undertaking, we also try to support different community groups. Octane sponsored a drive in a Lamborghini in the NSW WRX Club's annual North Rally, which raises funds for Cystic Fibrosis NSW. One of the friends I used to race with, Liam Wild, succumbed to the disease in 2012.

This year my kids also started competing in the Motorkhana series with me, and this has been great fun and a welcome distraction during the pandemic as we bonded (and fought) during the long hours in the garage and on practice runs.

Looking back, I would like to express my sincere gratitude for the trust and support Octane has received. With the pandemic here to stay at least until the end of this year, I wish everyone a blessed and successful 2021.

Image 4
Race days - the trick to beating them is to give them a slower car
 
20200715_141930-2
Mud Bath - Social Distancing done right

20200801_130657-2

 
Clean and ready for next high Octane adventure

 

Line

Session Timeout for TM1Web, PAW and PAX

Mode_Comment_Icon_black0
Alarm_Icon_13 min

We often get requests from users whose TM1 session has logged out. Depending on client requirements and standards, you may need to increase or decrease the session timeout. Changing the session timeout is a trade-off: it should not be too big or too small. If it's too big, many inactive sessions can lead to server performance issues; if it is too small, the user experience suffers.

Each TM1 application has its own session timeout parameter. Jump to the respective section, depending on your need.

TM1Web

1. Go to <Installation Folder>\IBM\cognos\tm1_64\webapps\tm1web\WEB-INF\configuration and open the tm1web_config.xml file.

Screen Shot 2020-08-02 at 8.14.04 am

 

2. Change HttpSessionTimeout to the desired value.

a. Please note the timeout value is specified in minutes.

 

Screen Shot 2020-08-02 at 8.14.13 am
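
For reference, the entry in tm1web_config.xml typically looks something like the line below (20 minutes is an illustrative value):

<add key="HttpSessionTimeout" value="20" />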

 

3. Save and close the tm1web_config.xml file.

4. Restart the IBM TM1 Application Server service.


PAX

  1. Go to http://localhost:9510/pmhub/pm/admin. The below screen appears.

 

Screen Shot 2020-08-02 at 8.16.50 am

2. Sign in using your credentials at the top right corner.

3. Expand Configurations and go to PMHub Session.

 

Screen Shot 2020-08-02 at 8.17.47 am

 

4. Change the MaxInactivity Timeout value. The default value is 3600 seconds.

 

Screen Shot 2020-08-02 at 8.17.54 am

 

PAW

1. Go to <PAW Installation Folder>\paw\config and open the paw.env and defaults.env files.

 

Screen Shot 2020-08-02 at 8.18.47 am

2. Copy the “export SessionTimeout” parameter from the defaults.env file, add it to the paw.env file with the desired value, and save.

 

Screen Shot 2020-08-02 at 8.19.02 am
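
The resulting line in paw.env is a single export, for example (the value shown is illustrative; check defaults.env for the default value and its unit):

export SessionTimeout=60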

 

 

Try out our TM1 $99 training 
Join our TM1 training session from the comfort of your home

find out more

 

 

Line

Adding images in PAW

Mode_Comment_Icon_black1
Alarm_Icon_14 min

In this article I would like to share a very useful tip on the different methods of adding images in Planning Analytics Workspace: one that is very well known, one that is lesser known and one that is relatively unknown. I intend to touch on the first two methods while focusing more on the last one.

But before I begin: as I write this blog article, there have been more than 2 million confirmed cases of COVID-19 worldwide, with over 130,000 deaths. I wish to take a moment, on behalf of Octane Software Solutions, to express our deepest condolences to all those, and their family members, who have directly or indirectly suffered and been affected by the pandemic; our thoughts go out to them.

In the same breath, a special shout-out and our gratitude to the entire medical fraternity, law enforcement, the various NGOs and the numerous other individuals, agencies and groups, both local and global, who have been putting their lives at stake to combat this pandemic and help the needy. Thank you to those on the front line and the unsung heroes of COVID-19. It is my firm belief that together we will succeed in this fight.

Back to the topic. One of the most used methods for adding images in PAW is to upload them to a content management and file sharing site like Box or SharePoint and paste the web link into the PAW Image Url field. Refer to the link below, where Paul Young demonstrates how to add an image using this method.

The other method is to upload your image to an encoding website like https://www.base64-image.de.

This provides a string which can then be pasted as the image URL to display the image. Note that it only works for limited file formats and small images.
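
For context, the string the site generates is a Base64 data URI of the general form below (payload truncated for illustration):

data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...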

Also note that, although the above two methods achieve the purpose of adding images in PAW, neither provides the capability to store the images on a physical drive, keeping a repository of the images used in PAW saved and accessible on your organisation's shared drive.

The third approach addresses this limitation as it allows us to create a shared drive, store our images in it and then reference it in PAW.

This can be done by creating a website in IIS Manager using a few simple steps, as listed below.

First off, ensure IIS is enabled on your data server as a prerequisite. You can check this by simply searching for IIS in your Windows menu.

Screen Shot 2020-04-21 at 7.59.26 am

In case no results are displayed, it means IIS has not been enabled yet.

To enable it, go to Control Panel > Programs > Turn Windows features on or off.

A wizard opens; click Next. Select Role-based or feature-based installation and click Next.

Screen Shot 2020-04-21 at 7.59.47 am

Select the server if it's not already selected (typically the data server where you're enabling IIS).

Select the Web Server check box and click Next.

Screen Shot 2020-04-21 at 8.01.46 am-1

 

Select IIS Hostable Web Core and click Install.

Image 3-1

 

This installs the required IIS components on the server, so we can now proceed to add the website in IIS Manager.

Before adding a website, navigate to C:\inetpub\wwwroot\ and create a folder in this directory. This will be the folder where we store our images.

Once IIS is enabled, follow the steps below:

1. Under Sites, right-click and select Add Website.

Screen Shot 2020-04-21 at 8.02.14 am

 

2. Update the following configuration settings:

a. Site name: enter the name of the site

b. Physical path: enter the folder path created in the earlier step

c. Port: enter any unreserved port number

d. Host name: enter the machine name

 

Screen Shot 2020-04-21 at 8


Now go to PAW and enter the image URL.

Image 4-1

 

Here, ibmpa.jpg is the image saved within the PAWImage folder.
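
Based on the settings above, the URL takes the general form http://<Host name>:<Port>/ibmpa.jpg, using the host name and port you entered when creating the website.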

Note: This only works in Planning Analytics Local.

 

Octane Software Solutions is an IBM Gold Business Partner specialising in TM1, Planning Analytics, Planning Analytics Workspace and Cognos Analytics, and in descriptive, predictive and prescriptive analytics.

 

Try out our TM1 $99 training 
Join our TM1 training session from the comfort of your home

find out more

 

Line

Automation in TM1 using AutoHotkey

Mode_Comment_Icon_black0
Alarm_Icon_13 min

This blog explains a few TM1 tasks which can be automated using AutoHotkey. For those who don't already know, AutoHotkey is an open-source scripting language used for automation.

1. Run TM1 process history from the TM1 server log:

With the help of AutoHotkey, we can read each line of a text file using the loop function; the content of each line is stored automatically in a built-in variable. We can also read the filenames inside a folder using the same function, with the filenames again stored in a built-in variable. By making use of this, we can extract TM1 process information from the TM1 server log and display the extracted information in a GUI. Let's go through the output of an AutoHotkey script which gives details of process runs.
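
A minimal AutoHotkey (v1) sketch of the core technique, not the full script behind the screenshots (the log path and search string are illustrative assumptions):

; Read the TM1 server log line by line and collect process-related entries
logFile := "C:\TM1\Logs\tm1server.log"
processRuns := ""
Loop, Read, %logFile%
{
    ; A_LoopReadLine is the built-in variable holding the current line
    if InStr(A_LoopReadLine, "Process")
        processRuns .= A_LoopReadLine . "`n"
}
; Show the extracted lines in a simple GUI
Gui, Add, Edit, w600 h400 ReadOnly, %processRuns%
Gui, Show,, TM1 Process History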

  • Below is the screenshot of the output when the script is executed. Here we need to provide the log folder and data folder paths.

    Picture1-23
  • After providing the details and clicking OK, the list of processes in the server's data folder is displayed in the GUI.

    Picture2-8

    Picture3-5
  • Once the list of processes is displayed, double-click on a process to get its run history. In the screenshot below we can see the status, date, time, average run time of the process, the error message and the username of whoever executed the process, thereby showing TM1 process history from the TM1 server log.

    Picture4-4

 

2. Opening TM1Top after updating the tm1top.ini file, and killing a process thread

With the help of the same loop function we used earlier, we can read the tm1top.ini file and update it using the FileAppend function in AutoHotkey. Let's again go through the output of an AutoHotkey script which opens TM1Top.

  • When the script is executed, the below screen comes up, asking whether or not to update the adminhost parameter of the tm1top.ini file.

    Picture7-3
  • On clicking “Yes”, a new screen comes up where the new adminhost needs to be entered.

    Picture8-4
  • After entering the value, a new screen asks whether or not to update the servername parameter of the tm1top.ini file.

    Picture7-3
  • On clicking “Yes”, a new screen comes up where the new servername needs to be entered.

    Picture8-4
  • After entering the value, TM1Top is displayed. To verify access, a username and password are required.

    Picture10-2
  • Once access is verified, just enter the thread ID that needs to be cancelled or killed.

    Picture11-4

 

Line

Planning Analytics for Excel: Trace TI status

Mode_Comment_Icon_black0
Alarm_Icon_12 min

IBM has been recommending that its users move to Planning Analytics for Excel (PAX) from TM1 Perspectives and/or TM1 Web. This blog is dedicated to clients who have either recently adopted PAX or are contemplating it, and shares steps on how to trace/watch TI process status while running a process using Planning Analytics for Excel.

Follow the steps below to run processes and to check TI process status.

1. Once you connect to Planning Analytics for Excel, you will be able to see cubes on the right-hand side; otherwise you may need to click on Task Pane.

 
pax1

 

2. Click on the middle icon as shown below and click on Show process. This will show all processes (to which the respective user has access) in the Task Pane.

 
pax2

 

3. You will now be able to see Processes.

 

pax3

 

4. To check/trace the status of a process (when triggered via Planning Analytics for Excel), right-click on Processes and click Active processes.

 

pax4

 

 

5. A new box will pop up, as shown below.

 
pax5

 

6. You can now run a process from the Task Pane and track its status in the box that popped up in step 5.

 

pax6

 

 

7. You can now see the status of the process in this box. Below is a screen print showing that, for the process cub.price.load.data, 4 out of 5 tasks have completed.

pax7

 

8. The below screen prints show the possible statuses of a TI process: Working, Completed and Process completed with errors.

pax8

 

Once done, you should be able to trace TI status in Planning Analytics for Excel. Happy transitioning.

As I pen my last blog for 2019, I wish you and your dear ones a prosperous and healthy 2020.

Until next time....keep planning & executing.

 

Line

PA+ PAW+ PAX (Version Conformance)

Mode_Comment_Icon_black0
Alarm_Icon_15 min

 

PA + PAW + PAX

IBM, with the intention of adding new features to Planning Analytics (TM1) and to its visualisation and reporting tools, Planning Analytics Workspace and Planning Analytics for Excel, releases new versions at regular intervals.

A new version of Planning Analytics Workspace or Planning Analytics for Excel comes out every 15-40 days; for Planning Analytics, a new version comes out every 3-6 months.

In this blog, I will discuss the version combinations to use between these tools to get the best results in terms of utilisation, performance, compatibility and bug fixes.

Planning Analytics for Excel ‘+’ Planning Analytics Workspace:

There are many versions of Planning Analytics for Excel (PAX), from 2.0.1 to the latest release, 2.0.48; similarly, there are many versions of Planning Analytics Workspace (PAW), from 2.0.0 to 2.0.47.

Planning Analytics for Excel, though installed, can only be used if Planning Analytics Workspace (PAW) is installed and running. So, the question to be answered is: will all versions of PAX work with all versions of PAW? The answer is NO. Yes, you read that right: not all versions of PAX are supported by every version of PAW. Some version combinations are supported and some are optimal; these are covered below.

Supported versions:

A Planning Analytics for Excel (PAX) version is supported by three versions of Planning Analytics Workspace (PAW): the matching version, the previous version and the next version.

Here is an example: the current PAX version is 2.0.45 and the current PAW version in use is 2.0.45. This PAX version is supported by PAW 2.0.45 (matching), 2.0.44 (previous) and 2.0.46 (next). I have considered two scenarios to explain this better.

Scenario (PAX upgrade):

Say a decision has been taken to upgrade PAX from 2.0.45 to the latest version, 2.0.48. Following the rule above, the new PAX is only supported by PAW 2.0.47, 2.0.48 and 2.0.49. As the existing PAW in use is 2.0.45, the new PAX would not be supported, so the PAX 2.0.48 upgrade must include a PAW upgrade as well: PAW has to be upgraded from 2.0.45 to 2.0.47, 2.0.48 or 2.0.49.

Scenario (PAW upgrade):

Say a decision has been taken to upgrade PAW from version 2.0.45 to 2.0.47, but the existing PAX version in use is 2.0.45.

If PAW is upgraded to 2.0.47, it will support only PAX versions 2.0.46, 2.0.47 and 2.0.48. A PAW upgrade must therefore include upgrading PAX to 2.0.46, 2.0.47 or 2.0.48 as part of the same activity.

Best suited/optimal versions:

Although a Planning Analytics for Excel (PAX) version is supported by three versions of PAW (matching, previous, next), optimal results are achieved with the matching and next versions of Planning Analytics Workspace (PAW).

Here is an example: the current PAX version is 2.0.45 and the current PAW version in use is 2.0.45. This PAX version, though supported by PAW 2.0.44, 2.0.45 and 2.0.46, is optimal with the matching and next versions, in this case 2.0.45 and 2.0.46.

table 1

Planning Analytics ‘+’ Planning Analytics for Excel:

To check which PAX versions suit a Planning Analytics version, we should always take the bundled PAX/PAW package version as the reference to PA.

For example, PAX version 2.0.43 is bundled with PA version 2.0.7, and PAX 2.0.36 is packaged with PA version 2.0.6.

Supported and optimal versions:

Planning Analytics for Microsoft Excel supports three different long-cadence versions of Planning Analytics:

  • The Planning Analytics version that was bundled with that version of Planning Analytics for Microsoft Excel, or the most recent Planning Analytics version previously bundled with it.
  • The two Planning Analytics versions before the bundled version.

Here is an example: PAX version 2.0.43 is bundled with PA 2.0.7. PAX 2.0.43 is supported by PA 2.0.7 (the bundled version), 2.0.6 and 2.0.5. PAX 2.0.43 will not work well with older versions; also note that PAA was introduced in PA version 2.0.5.

The table below may help with PAX and PA supported/optimal versions.

table

 

For more details, click here.

Read some of my other blogs :

Predictive & Prescriptive-Analytics 

Business-intelligence vs Business-Analytics

What is IBM Planning Analytics Local

IBM TM1 10.2 vs IBM Planning Analytics

Little known TM1 Feature - Ad hoc Consolidations

IBM PA Workspace Installation & Benefits for Windows 2016

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training. Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad. 

 

Try out our TM1 $99 training 
Join our TM1 training session from the comfort of your home

find out more

 

Line

Comparison of Linux vs Windows for IBM Planning Analytics (TM1)

Mode_Comment_Icon_black0
Alarm_Icon_110 min

Linux vs windows

 

If you are thinking of moving to Planning Analytics, this document can help you select the best operating system (OS) for your PA installation. Planning Analytics currently supports the operating systems listed below:

  • Windows 
  • Linux 

At Octane, we have expertise working in both Windows and Linux environments, and a number of clients have asked which one is the best fit for their organisation.

Although it looks like a simple question, it is hard to answer; we would like to highlight some advantages of both to help you select the best fit for your organisation.

WINDOWS 

Versions Supported for Planning Analytics: 

  • Windows Server 2008 
  • Windows Server 2012 
  • Windows Server 2016 

Advantages 

  • Graphical User Interface (GUI) 

Windows makes everything easier. Use a pre-installed browser to download the software and drivers, and the install wizard and File Explorer to install the software. The Cognos Configuration tool defaults to a graphical interface that's easy to configure.

  • Single Sign On (SSO) 

If your organisation uses Active Directory for user authentication then, since it's a Microsoft product, it's easy to connect and set up Single Sign-On (SSO) within the Windows OS.

  • Easy to Support  

It's easy to do admin and maintenance-related tasks thanks to the graphical interface.

  • Hard to avoid GUI interface completely 

Even if you want to avoid Windows, Planning Analytics is a GUI-based product, so it's difficult to avoid a Windows environment completely. It's easy to install and configure Planning Analytics in a Windows environment.

  • IBM Support 

Almost all IBM support VMs run on Windows, so when the IBM support team tries to replicate an issue you might have discovered, it's quick and easy for them to test.

LINUX 

Versions Supported for Planning Analytics: 

  • Red Hat Enterprise Linux (RHEL) 8 
  • Red Hat Enterprise Linux (RHEL) Server 6.8, 7.x 

Advantages 

  • Cost-effective: Linux servers are cost-effective, i.e. available at a lower price than Windows. If your organisation runs a large distributed environment, Windows licensing can add up to significant costs.
  • Security: While all servers have vulnerabilities that can be locked down, Linux is typically less susceptible and more secure than Windows.
  • Scripting: If you like to automate processes such as startups/shutdowns and server maintenance, the Linux command line and scripting tools make this easy (see the sketch after this list).
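
A minimal shell sketch of that kind of automation (the service name, paths and commands are assumptions for illustration, not IBM's documented procedure):

#!/bin/sh
# Stop the TM1 server, back up its data directory, start it again.
DATA_DIR=/data/tm1/prod
BACKUP_DIR=/backup/tm1

systemctl stop tm1-prod    # assumes the server is wrapped as a systemd unit
tar -czf "$BACKUP_DIR/tm1_$(date +%F).tar.gz" "$DATA_DIR"
systemctl start tm1-prod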

There are a number of Linux OS versions available in the market, and because of this it can be difficult to find information for a specific version.

CONCLUSION 

When it comes to selecting an operating system, there is no right or wrong choice. It totally depends on usage and how comfortable you are with the selected OS. At Octane we prefer to suggest the Windows OS because of its simple UI for installation and configuration, as well as the support base. However, if you run a Linux-based shop and have server administrators who are comfortable with and prefer Linux, then go with Linux; we are here to help you.

Line

Planning Analytics Secure Gateway: Token Expiry

Mode_Comment_Icon_black0
Alarm_Icon_13 min

Before you read further, please note that this blog details the Secure Gateway connection used for Planning Analytics deployed "on-cloud" as a Software as a Service (SaaS) offering.

This blog details the steps to renew a Secure Gateway token, either before or after the token has expired.

What is IBM Secure Gateway:

IBM Secure Gateway for IBM Cloud provides a quick, easy and secure solution for establishing a link between Planning Analytics on cloud and a data source, typically an RDBMS source such as IBM DB2, Oracle Database, SQL Server, Teradata, etc. Data sources can reside either on-premise or on-cloud.

Secure and Persistent Connection:

By deploying this lightweight, natively installed Secure Gateway Client, a secure, persistent connection can be established between your environment and the cloud. This allows your Planning Analytics models to interact seamlessly and securely with on-premises data sources.

 

Picture1-22

 

How to Create IBM Secure Gateway:

Click on Create-Secure-Gateway and follow the steps to create the connection.

Secure Gateway Token Expiry:

If the token has expired, Planning Analytics models on cloud cannot connect to source systems.

How to Renew Token:

Follow the steps below to renew the Secure Gateway token.

  • Navigate to the Secure Gateway.
  • Click on the Secure Gateway connection for which the token has expired.
  • Go to Details as shown below and enter 365 (the maximum) beside Expiration days. Here 365 days, or a year, is the maximum time after which the token will expire again. Once done, click Update.

Picture2-7

This should reactivate your token, and TIs should now be able to interact with the source system.

 

You may also like reading “ Predictive & Prescriptive-Analytics ” , “ Business-intelligence vs Business-Analytics ” ,“ What is IBM Planning Analytics Local ” , “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016

Line

Adding customizations to Planning Analytics Workspace

Mode_Comment_Icon_black0
Alarm_Icon_14 min

One of the common complaints that I constantly hear from users, and have myself put up with when using Planning Analytics Workspace, is its lack of available fonts and color palettes for its visualisations.

This lack of flexibility put a hard restriction on designing intuitive interfaces and dashboards, as we were limited to only the fonts and color combinations provided by the platform. It became even more challenging when we had to follow a corporate color scheme and font type. But this is no more!

In Planning Analytics Workspace's 2.0.45 release, IBM has addressed these limitations by extending to users the flexibility to upload the fonts and color themes of their choice in Workspace and apply them in their visualisations.

Users can now add new themes by exporting the JSON theme file from the Administration page in PAW and uploading the file back with updated code for the new color themes.

The http://colorbrewer2.org/ website offers sample color palettes, so you can quickly get started with ready-to-use customised color codes that you can paste into the JSON file.

Similarly, you may choose any free color picker extension available in the Chrome Web Store to get the hex code from anywhere within a webpage.

As for fonts, you can either download free fonts from Google directly (https://www.fonts.com/web-fonts/google) or go to https://www.fonts.com/web-fonts to purchase a desired font from its wide range of fancy fonts.

Tip: My all-time favourite is the Webdings font, as it allows me to use fonts as images. This enhances the performance of my dashboard by substituting images with fonts displayed as icons, thereby considerably reducing dashboard refresh time and data rendering.

See the full list of graphics this font can display at the link below: http://www.911fonts.com/font/download_WebdingsRegular_10963.htm

Because this is a paid font, it would be highly desirable for IBM to incorporate it into the existing list of fonts in PAW; until then, it can be downloaded from Microsoft at the link below.

https://www.fonts.com/font/microsoft-corporation/webdings

 

Refer to the IBM link below for more info on how to add fonts and color palettes to PAW.

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_prism_gs.2.0.0.doc/c_paw_corp_identity_overview.html

 

To identify where the color palette code sits within the JSON file, search for the keyword "ColorPalette" in Notepad++; it should list results consisting of the root branch called ColorPalette and ids that have a unique number suffixed to ColorPalette (see screenshot below).

 

image-81

 

Note: It is not easy to correlate a color palette you see in PAW with its corresponding JSON code (id). The only way to do so is to manually convert the hex codes of each id into colors and then visually inspect them in PAW, so it's a bit of a manual process, and it becomes even more tedious as you add more color palettes.

Given this complexity, below are some of the key points to be wary of when working with palettes in PAW.

  1. The color palettes displayed in PAW under Visualisation details correspond to the placement order of the code within the "ColorPalette" section of the JSON theme file. So, if the ColorPalette3 code is placed above the ColorPalette2 code, the second palette you see in PAW correlates to the ColorPalette3 code.
    image-81
  2. The heat-mapping color, however, corresponds to the numeric value of the id within the JSON file and not the placement order, which is quite weird. So, taking the same scenario as above, the 2nd palette in PAW (which correlates to ColorPalette3) will still apply the heat-mapping color of ColorPalette2. It is therefore important to keep the numeric order consistent so that the code can easily be correlated with the palette in PAW.

image-81

  3. In case the same id is repeated twice with different color codes, the first one that appears in the JSON file takes precedence and the second is ignored.
Line

TM1 object extensions

Mode_Comment_Icon_black0
Alarm_Icon_113 min

 

This article talks about the different extensions/files seen in the data directory and how they relate to the front-end objects we see in Architect. Since TM1 is an in-memory tool, all the objects seen in Architect, like cubes, dimensions, TI processes and so on, are saved in the data directory with a specific extension and in a specific way to differentiate them from other objects.

By understanding these extensions/files, it becomes easier to find objects and to decide which objects need to be considered when backing up or moving a specific set of changes. Consider the case of taking an entire data folder backup, which might take up a large amount of space, when only a few objects have undergone changes; it would be more efficient to back up just those changes rather than the complete data directory.
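
For example, if only the Month dimension and its subsets (an example used later in this article) had changed, a targeted backup from a Windows command prompt might look like this (paths are illustrative):

copy "D:\TM1Data\Month.dim" "E:\TM1Backup\"
xcopy "D:\TM1Data\Month}Subs" "E:\TM1Backup\Month}Subs\" /E /I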

Also, by understanding these extensions and knowing what information they hold, developers can efficiently decide which objects need to be moved and what impact moving them has on the system. For a better understanding, let's divide the objects seen in TM1 into three sections, i.e. dimensions, cubes and TI processes, and see what files are created in the data directory for each.

 

Dimension objects

In this section, we will see what files are created in the data directory and how they relate to the front-end interface of Architect. To better understand how the dimension objects seen in Architect are stored in the data directory, we will take the example of a dimension called "Month" with 2 subsets, "All Members" and "Months Only".

 image-55

*.dim File

The dimensions seen in Architect are saved in the data folder with a <DimensionName>.dim extension. The file holds the hierarchy and element details of that dimension. If the dimension needs to be backed up or migrated, we can just take this file. In this example, the "Month" dimension seen in Architect is saved in the Month.dim file in the data directory, and by reading this file, Architect shows the "Month" dimension and its elements with their hierarchy.

In Architect

In Data Directory

image-56 image-57

 

*}Subs Folder

All the public subsets seen under a dimension in Architect are placed in the <DimensionName>}Subs folder for that dimension in the data folder. In this case, the subsets created for the Month dimension, i.e. "All Members" and "Months Only", are placed in the Month}Subs folder.

In Architect

In Data Directory

image-58

image-59

 

*.Sub File

Subsets are created to make it easier to access a set of elements of a dimension, and all the subsets of any dimension are placed in the <DimensionName>}Subs folder with a <SubsetName>.sub extension. The subsets of the Month dimension, i.e. "All Members" and "Months Only", are saved in the Month}Subs folder as All Members.sub and Months Only.sub.

In Architect

In Data Directory -> Month}subs

image-60

image-61

 

Cube Objects

In this section, we will go through the files that are created in the data directory for cube-related objects and how they relate to the cube objects in Architect. For this case, let's use the "Month_ID" cube as an example, along with its views "View1" and "View2".

image-62

*.Cub File

The cube and data seen in Architect for any cube are saved in a <CubeName>.cub file in the data directory. So, if only the data needs to be copied/moved between environments, we can do this just by replacing this file for the respective cube. Here, the cube "Month_ID" and its data seen in Architect are saved in the Month_ID.cub file in the data directory of that TM1 server.

In Architect

In Data Directory

image-63 image-64

 

*}Vues Folder

All the public views seen under the respective cube in Architect are saved in the <CubeName>}Vues folder of the data directory. In this case, the views "View1" and "View2" of the "Month_ID" cube are saved in the Month_ID}Vues folder of the data directory.

In Architect

In Data Directory

image-65

image-65

 

*.vue file

All the public views created under a cube are saved in the <CubeName>}Vues folder with a <ViewName>.vue extension. So, the views "View1" and "View2" are saved in the Month_ID}Vues folder as View1.vue and View2.vue.

 

In Architect

In Data Directory->Month_ID}vues

image-67 image-68

 

*.RUX file

This is the rule file; all the rule statements written in the Rule Editor for a cube can be seen in the <CubeName>.rux file. Here, the rule statements written in the rule editor for the "Month_ID" cube are saved in the Month_ID.rux file in the TM1 data directory.

In Architect

In Data Directory

image-69

image-70

 

*.blb file

These files are referred to as blob files, and they are used to hold format data. For example, if a format is applied inside the rules of a cube, that data is saved in <CubeName>.blb. Similarly, if a format style is applied to a view, the format details are saved in a <CubeName>.<ViewName>.blb file. In this case, the format style applied in the rule editor for the "Month_ID" cube is saved in Month_ID.blb, and the format style applied to "View1" is saved in the Month_ID.View1.blb file, both of which can be found in the TM1 data directory.

In Architect

In Data Directory

Format Style data Applied in Rules

 

image-71

Format Style applied in View1

image-72

 

*.feeders file

This file gets generated only when PersistentFeeders is set to true in the TM1 configuration file. Once the feeders have been computed in the system, they are saved in <CubeName>.feeders, and this file is updated as the feeders change. Here, the feeder statements in the rule editor for "Month_ID" are calculated and saved as Month_ID.feeders.

In Architect

In Data Directory

Feeders statements in Rule for Month_ID Cube

image-73

 

TI and Chore Objects

Here, we are going to look at files that are created in data directory for TI processes and chores.

*.Pro file

All the TI processes in Architect are saved in the data folder with a <TIProcessName>.pro extension. Now, assume there is a TI process "Month_Dimension_Update" seen in Architect; this TI process is saved as the Month_Dimension_Update.pro file in the data directory.

In Architect

In Data Directory

image-74 image-75

 

*.Cho file

A chore, which is used to schedule TI processes, is saved in the data folder with a <ChoreName>.cho extension. Say we have to schedule the TI process "Month_Dimension_Update": we create a chore, "Month_Dim_Update", and this creates the file Month_Dim_Update.cho.

In Architect

In Data Directory

image-76 image-77


Application objects

Applications provide the functionality to create virtual folders, which helps with accessing and the orderly sorting of TM1 objects like dimensions, TI processes, views, Excel reports and so on. When any TM1 object is added to an Application/virtual folder, a shortcut is created for that object, enabling us to access the object from the shortcut; we can also rename these shortcuts as required.

When these objects are added, they in turn create a file in the }Applications folder of the data files. These files hold object information like type, name, reference and so on. Let's take the example of a Test virtual folder under Applications.

image-78

You can find these objects in the data files folder > }Applications folder > Test folder.

image-79

image-80

The table below shows how the objects are mapped from the front-end Architect to the backend files in the data folder.

Object in TM1 Application      Object Type       File created in the }Applications folder
Test                           Virtual folder    Test folder
App_Cube_Example               Cube              App_Cube_Example.cube
App_Cube_View_Example          View              App_Cube_View_Example.view
App_Dim_Example                Dimension         App_Dim_Example.dimension
App_Dim_Subset_Example         Subset            App_Dim_Subset_Example.subset
App_TI_Process_Example         TI process        App_TI_Process_Example.process
App_Chore_Example              Chore             App_Chore_Example.chore

You can also add files, URLs and Excel files from the system to a TM1 Application folder. When we add files like text files or Excel files to a TM1 Application folder, *.blob files are created in the backend }Applications folder of the data directory. Similarly, a *.extr file is created for a URL, and this file is saved in the TM1 Application folder.

Also, if we selected "copy the file to the TM1 server", a copy of that file gets saved in the }Externals folder of the data directory. Similarly, when a report is created and uploaded from the Perspectives client of TM1, a *.blob file is created and placed in the }Externals folder.

 

 

Line

IBM Planning Analytics for Excel: Bug and its Fix

Mode_Comment_Icon_black0
Alarm_Icon_15 min

Since the launch of Planning Analytics a few years back, IBM has been recommending that its users move to Planning Analytics for Excel (PAX) from TM1 Perspectives and TM1 Web. As new users migrate to PAX every day, it's prudent that I share my experiences.

This blog will be part of a series in which I will try to highlight and make users aware of different aspects of this migration. This one specifically details a bug I encountered during a project in which our client was using PAX, and the steps taken to mitigate the issue.

 

What was the problem:

Scenario: a Planning Analytics user triggers a process from the navigation pane within PAX, uses the "Edit parameters" option to enter a value for a numeric parameter, and clicks Save to run the process.

Issue: when done this way, the process fails to complete. However, if it is instead run using other tools like Architect, Perspectives or TM1 Web, the process completes successfully.

For example, let's assume a process, cub.price.load.data, takes a numeric value as input to load data. The user clicks on Edit parameters, enters a value and saves it to run. The process fails. Refer to the screenshots attached.

Using PAX.

Picture1-18    Picture2-6

Picture3-4

 

Using Perspectives

Picture4-3

 

What’s causing this:

During our analysis, it was found that when users in PAX click on Edit parameters, enter a value against the numeric parameter and save it, the numeric parameter gets converted into a string parameter in the backend, thereby modifying the TI process.

As the TI was designed and developed to handle a numeric variable and not a string, this change in the variable type from numeric to string was causing the failure. Refer to the screenshots below.

 Picture5-3

When created,

Picture6-3

Once saved,

Picture7-2

What’s the fix?

The section below illustrates how we mitigated and remediated this bug.

For all TIs using numeric parameters:

  • List all TIs using a numeric type parameter.
  • Convert the "Type" of these parameters to String and rename the parameter to identify itself as a string variable (best practice). In the earlier example, I called it pValue while it held a numeric value and psValue once it became a string.
  • Next, within the TI's Prolog, add extra code to convert the value of this parameter back into the same old numeric variable, for example pValue = NUMBR(psValue); (see the sketch after this list).
  • This should fix the issue.
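
A minimal Prolog sketch of the workaround, using the variable names from the example above (the empty-string guard is an illustrative addition, not part of the original fix):

# Prolog: psValue arrives from PAX as a string; convert it back to a number
pValue = NUMBR(psValue);

# Optional guard: abort the process if no value was entered
IF(TRIM(psValue) @= '');
   ProcessError;
ENDIF;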

Note that, while there are many different ways to handle this issue, this one best suited our purpose and the project, especially considering the time and effort it would have required to modify all affected processes.

 

Planning Analytics for Excel: versions affected

The latest available version (as of 22nd October 2019) is 2.0.46, released on 13th September 2019. Before publishing this blog, we spent a good amount of time testing this bug on all available PAX versions. It exists in all Planning Analytics for Excel versions up to 2.0.46.

Permanent fix by IBM:

This has been highlighted to IBM, along with the severity of the issue. We believe this will be fixed in the next Planning Analytics for Excel release; as per IBM (refer to the image below), it seems the fix is part of the upcoming version 2.0.47.

Picture8-3 

 

You may also like reading “ Predictive & Prescriptive-Analytics ” , “ Business-intelligence vs Business-Analytics ” ,“ What is IBM Planning Analytics Local ” , “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016”.

Line

IBM Planning Analytics Secure Gateway Client: Steps to Set-Up

Mode_Comment_Icon_black0
Alarm_Icon_17 min

This blog covers all the steps to install the IBM Secure Gateway Client.

Installing the IBM Secure Gateway Client is one of the crucial steps towards setting up a Secure Gateway connection between Planning Analytics Workspace (on-cloud) and an RDBMS (relational database) residing on-premise or on-cloud.

Picture1-22-1

What is IBM Secure Gateway:

IBM Secure Gateway for IBM Cloud provides a quick, easy and secure solution for establishing a link between Planning Analytics on cloud and a data source. The data source can reside on an on-premise network or on cloud; typical data sources are RDBMSs, for example IBM DB2, Oracle Database, SQL Server, Teradata, etc.

Secure and Persistent Connection:

A Secure Gateway must be created to access on-premise RDBMS data sources; it is useful for importing data into TM1 via TurboIntegrator and for drill-through capability.

By deploying the lightweight, natively installed Secure Gateway Client, a secure, persistent and seamless connection can be established between your on-premises data environment and the cloud.

The Process:

This is a two-step process:

  1. Create the data source connection in Planning Analytics Workspace.
  2. Download and install the IBM Secure Gateway Client.

To download the IBM Secure Gateway Client:

  1. Log in to Workspace (on-cloud).
  2. Navigate to Administration -> Secure Gateway.

Picture2-5

Click on the icon shown below; this will prompt a pop-up. Select the operating system and follow the steps to install the client.
Picture3-3

Once you click, a new pop-up will come up where you are required to select the operating system on which you want to install the client.

Picture4-2

Choose the appropriate option and click Download.

If downloads default to the Downloads folder, you will find the software there, as shown below.

Picture5-2

Installing the IBM Secure Gateway Client:

To install the tool, right-click and run as administrator.

Picture6-2

 

Keep the default settings for the destination folder and language, unless you need to modify them.

Picture7-1

Check the box below if you want to run the client as a Windows service.

Picture8-2

Now, this is an important step: we are required to enter the gateway IDs and security tokens to establish a secured connection. These need to be copied over from the secure connection created earlier in Planning Analytics Workspace (refer to step 1, creating the data source connection in Workspace).

Picture9-2

The figure below illustrates Workspace showing the Gateway ID and Security Token; these need to be copied and pasted into the Secure Gateway Client (refer to the illustration above).

Picture10-1

If you choose to launch the client with connections to multiple gateways, take care when providing the configuration values (an illustrative example follows the note below):

  1. The gateway IDs need to be separated by spaces.
  2. The security tokens, ACL files and log levels should be delimited by --.
  3. If you don't want to provide any of these three values for a particular gateway, use 'none'.
  4. If you want the Client UI, choose it; otherwise select No.

Note: Please ensure that there are no residual white spaces.
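
For illustration, a client launched against three gateways might be configured with values like these (the IDs, tokens and log levels below are placeholders, not real values; yours come from your own Workspace configuration):

Gateway IDs:     gatewayID1 gatewayID2 gatewayID3
Security tokens: token1--none--token3
ACL files:       none--none--none
Log levels:      INFO--none--DEBUG

Here the second gateway supplies no security token and no log level, so 'none' holds its place in each delimited list.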

Picture11-3

Now click Install. Once the installation completes successfully, the IBM Secure Gateway Client is ready for use.

The connection is now ready: Planning Analytics can now connect to data sources residing on-premise, or on any other cloud infrastructure where the IBM Secure Gateway Client is installed.

 

You may also like reading “Predictive & Prescriptive-Analytics”, “Business-intelligence vs Business-Analytics”, “What is IBM Planning Analytics Local”, “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016”.

Line

What is IBM Watson™ Studio?

1 min read

IBM Watson™ Studio is a platform for businesses to prepare and analyse data as well as build and train AI and machine learning models in a flexible hybrid cloud environment.

IBM Watson™ Studio enables your data scientists, application developers and subject matter experts to work together more easily and collaborate with the wider business, delivering faster insights in a governed way.


Watson Studio is also available on the desktop, bringing the most popular portions of Watson Studio Cloud to your Microsoft Windows or Apple Mac PC. IBM SPSS® Modeler, notebooks and IBM Data Refinery come in a single install, giving you comprehensive and scalable data analysis and modelling abilities.

For the enterprise there are further options: Watson Studio Local, a version of the software deployed on-premises inside the firewall, and Watson Studio Cloud, which is part of IBM Cloud™, a public cloud platform. Whichever version your business uses, you can start with Watson Studio Cloud and download a trial of the desktop version today!

Over the next five days, we'll send you use cases and other materials worth reviewing at your convenience. Be sure to check our social media pages for these.

Line

IBM Planning Analytics (TM1) Vs Anaplan

4 min read

Picture15-1

IBM Planning Analytics (TM1) vs Anaplan

There has been a lot of chatter lately around IBM Planning Analytics (powered by TM1) vs Anaplan. Anaplan is a relatively new player in the market and has recently listed on the NYSE. It reported revenue of USD 240.6M in 2019 (and, interestingly, an operating loss of USD 128.3M). Compare that with IBM, which had 2018 revenue of USD 79.5 billion (there is no clear information on how much of this came from the Analytics area) and a net profit of USD 8.7 billion. The global Enterprise Performance Management (EPM) market is around USD 3.9 billion and expected to grow to USD 6.0 billion by 2022. The market for spreadsheet-based processes is a whopping USD 60 billion. (Source: IDC)

Anaplan was born out of the old Adaytum Planning application, which was acquired by Cognos; Cognos in turn was acquired by IBM in 2007. Anaplan also spent USD 176M on sales and marketing, so most people in the industry will have heard of it or come across some form of its marketing. (Source: Anaplan.com)

I’ve decided to have a closer look at some of the crucial features and functionalities and assess how it really stacks up.

Scalability 

There are some issues around scaling up Anaplan cubes where large datasets are under consideration (an 8 billion cell limit? While this sounds big, most of our clients reach this scale fairly quickly with medium complexity). With IBM Planning Analytics (TM1) there is no need to break up a cube into smaller cubes to meet data limits, nor any demand to combine dimensions into a single dimension. Cubes are generally developed with business requirements in mind, not system limitations, thereby offering the business analyst superior degrees of freedom.

For example, if enterprise-wide reporting were the requirement, the cubes might need to be broken up via a logical dimension like region or division. This in turn would make consolidated reporting laborious, and data slicing and dicing difficult, almost impossible.

 

Picture14-1-1  


Excel Interface & Integration

Love it or hate it, Excel is the tool of choice for most analysts and finance professionals. I reckon it is unwise to offer a BI tool in today’s world without proper Excel integration. I find Planning Analytics (TM1) users love the ability to use the Excel interface to slice and dice, drill up and down hierarchies, and drill to the data source. The ability to create interactive Excel reports, with cell-by-cell control of data and formatting, is a sure-shot deal clincher.

On the other hand, on exploration I realised Anaplan offers very limited Excel support.

Picture11-2
Picture12-1

 

Analysis & Reporting

In today’s world users have come to expect drag-and-drop analysis: the ability to drill down, and to build and analyse alternate views of the hierarchy, in real time. However, if each such query requires data to be moved around cubes and/or requires building separate cubes, then it’s counterproductive. This also increases the maintenance and data storage overheads, and you lose sight of a single source of truth as you start developing multiple cubes with the same data just stored in different forms. This is the case with Anaplan due to the software’s intrinsic limitations.

Anaplan also requires users to invest in a separate reporting layer, as it lacks native reporting, dashboards and data visualisations.

This in turn results in:

  1. Increased cost
  2. Increased risk
  3. Increased complexity
  4. Limited planning due to data limitations

IBM Planning Analytics, on the contrary, offers the out-of-the-box ability to view and analyse all your product attributes and to slice and dice via any of them.

It also comes with a rich reporting, dashboard and data visualisation layer called Workspace. Planning Analytics Workspace delivers self-service web authoring to all users. Through the Planning Analytics Workspace interface, authors have access to many visual options designed to help improve financial input templates and reports. Planning Analytics Workspace benefits include:

  1. Free-form canvas dashboard design
  2. Data entry and analysis efficiency and convenience features
  3. Capability to combine cube views, web sheets, text, images, videos, and charts
  4. Synchronised navigation for guiding consumers through an analytical story
  5. Browser and mobile operation
  6. Capability to export to PowerPoint or PDF

Picture13-1

Source: Planning Analytics (TM1) cube

Line

Planning Analytics - Cloud Or On-Premise

4 min read

cloudsaas-1

This blog details IBM Planning Analytics On-Cloud and On-Premise deployment options. It focuses on and highlights the key points that should help you make the decision: whether to adopt cloud or stay on premise.

 

IBM Planning Analytics:

As part of their continuous endeavour to improve the application interface and deliver a better customer experience, IBM rebranded TM1 to Planning Analytics a couple of years back, a release that came with many new features and a completely new interface. With this release (the PA 2.x version, as it has been called), IBM is letting clients choose Planning Analytics as local software or as Software as a Service (SaaS) deployed on the IBM SoftLayer cloud.

cloud-vs-on-premise-1280x720-1

Planning Analytics on Cloud:

Under this offering, the Planning Analytics system operates in a remotely hosted environment. Clients who choose Planning Analytics deployed “on-cloud” can reap the many benefits typical of any SaaS.

With this subscription, clients need not worry about software installation, versions, patches, upgrades, fixes, disaster recovery, hardware and so on.

They can focus on building business models and enriching data from different source systems, giving meaning to the data they have by converting it into business-critical, meaningful, actionable insights.

Benefits:

While not a laundry list, the following covers the significant benefits.

  • Automatic software updates and management.
  • CAPEX Free; incorporates benefits of leasing.
  • Competitiveness; long term TCO savings.
  • Costs are predictable over time.
  • Disaster recovery; with IBM’s unparalleled global datacentre reach.
  • Does not involve additional hardware costs.
  • Environment friendly; credits towards being carbon neutral.
  • Flexibility; capacity to scale up and down.
  • Increased collaboration.
  • Security; with options of premium server instances.
  • Work from anywhere; thereby driving up productivity & efficiencies.

Clients must have an Internet connection to use SaaS and, of course, Internet speed plays a major role. In the present world an Internet connection has become a basic necessity for all organizations.

Picture11-1

Planning Analytics Local (On-Premise):

Planning Analytics Local is essentially the traditional way of getting the software installed on a company’s in-house server and computing infrastructure, located either in its own data centre or hosted elsewhere.

In an on-premise environment, the installation, upgrade and configuration of IBM® Planning Analytics Local software components are the organization’s responsibility.

Benefits of On-Premise:

  • Full control.
  • Higher security.
  • Confidential business information remains within the organization’s network.
  • Lesser vendor dependency. 
  • Easier customization.
  • Tailored to business needs.
  • Does not require Internet connectivity, unless “anywhere” access is enabled.
  • Organization has more control over implementation process.

As is evident, the on-premise option comes with some cons as well; a few are listed below.

  • Higher upfront cost
  • Long implementation period.
  • Hardware maintenance and IT cost.
  • In-house Skills management.
  • Longer application dev cycles.
  • Robust but inflexible.

On-premise software demands constant maintenance and ongoing servicing from the company’s IT department.

Organizations running on-premise have full control over the software and its related infrastructure, and can perform internal and external audits as and when needed or as recommended by governing/regulatory bodies.

Before making the decision, it is also important to consider many other influencing factors: the necessary security level, the potential for customization, the number of users, modelers and administrators, the size of the organization, the available budget, and the long-term benefits to the organization.

While you ponder this, many clients have adopted a “mid-way” hybrid environment, under which applications are gradually moved from on-premise to cloud in a phased manner, based on factors like workload economics, application evaluation and assessment, and security and risk profiles.

 

You may also like reading “What is IBM Planning Analytics Local”, “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016”.

For more information: to check on your existing Planning Analytics (TM1) entitlements and understand how to upgrade to Planning Analytics Workspace (PAW), reach out to us at info@octanesolutions.com.au for further assistance.

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training. Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us, visit OctaneSoftwareSolutions.

Line

Is Your Data Good Enough for Business Intelligence Decisions?

3 min read

business-intelligence-1

There’s no question that more and more enterprises are employing analytics tools to help in their strategic business intelligence decisions. But there’s a problem - not all source data is of a high quality.

Poor-quality data likely can’t be validated and labelled, and more importantly, organisations can’t derive any actionable, reliable insights from it.

So how can you be confident your source data is not only accurate, but able to inform your business intelligence decisions? It starts with high-quality software.

 

Finding the right software for business intelligence

There are numerous business intelligence services on the market, but many enterprises are finding value in IBM solutions. 

IBM’s TM1 couches the power of an enterprise database in the familiar environment of an Excel-style spreadsheet. This means adoption is quick and easy, while still offering you budgeting, forecasting and financial-planning tools with complete control.

Beyond TM1, IBM Planning Analytics takes business intelligence to the next level. This Software-as-a-Service solution gives you the power of a self-service model while delivering data governance and reporting you can trust. It’s a robust cloud solution that is agile while offering foresight through predictive analytics powered by IBM’s Watson.

 

business-intelligence-3-1-1

 

Data is only one part of the equation

But it takes more than just the data itself to make the right decisions. The data should help you make smarter decisions faster, while your business intelligence solution should make analysing the data easier. 

So how do you ensure top-notch data? Consider these elements of quality data (a small illustrative check follows the list):

  • Completeness: Missing data values aren’t uncommon in most organisations’ systems, but you can’t have a high-quality database if business-critical information is missing.
  • Standard format: Is there a consistent structure across the data – e.g. dates in a standard format – so the information can be shared and understood?
  • Accuracy: The data must be free of typos and decimal-point errors, be up to date, and be accurate to the expected ‘real-world’ values.
  • Timeliness: Is the data ready whenever it’s needed? Any delays can have major repercussions for decision-making.
  • Consistency: Data that’s recorded across various systems should be identical. Inconsistent datasets – for example, a customer flagged as inactive in one system but active in another – degrade the quality of information.
  • Integrity: Is all the data connected and valid? If connections are broken, for example if there’s sales data but no customer attached to it, the risk of duplicating data rises because related records cannot be linked.
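
To make the completeness point concrete, here is a small illustrative TurboIntegrator-style sketch for the Data tab of a load process; the variable names and file name are hypothetical, not from any particular model:

# Data tab sketch: reject source records whose customer key is missing
IF(TRIM(vCustomer) @= '');
  # Log the rejected record for follow-up, then skip it
  ASCIIOUTPUT('rejected_records.csv', vDate, vAmount);
  ItemSkip;
ENDIF;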

Are you looking to harness the power of your source data to make actionable business decisions? Contact Octane to find out how we can help you leverage your data for true business intelligence.

 

business-intelligence-2-1-1

 

Line

Self Service: How Big Data Analytics is Empowering Users

3 min read

big-data-analytics-1

 

Smart businesses are seeking out new ways to leverage the benefits of their big data analytics programs, and the self-service model is coming up trumps. By placing the onus directly on business users, enterprises are empowering customers with insights-driven dashboards, reports, and more. But it’s not the only bonus. 

Arguably an even greater upside for organisations is that it alleviates the talent shortage that often comes with big data. With most companies employing only a handful of data experts who can deliver analytics insights to customers, the self-service model frees those experts up to concentrate on more important tasks, while allowing the masses to derive their own insights on their own terms.

 

What are the real benefits of self service?

If nothing else, a self-service model creates a ‘democratisation’ of big data, giving users the freedom to access the data they need when they need it most: during the decision-making process.

Moreover, there’s a low cost to entry – coupled with reduced expenses thanks to freeing up data science and IT resources – and faster time to insight. When users know what they need and can change their research strategies according to new and changing demands, they become more empowered.

But it’s not all smooth sailing – giving customers the tools they need for self service is only one part of the equation. They must also be educated on the potential pitfalls.

 

big-data-analytics-2-1

 

Avoid the common hurdles

When several users have access to specific data, there’s a risk of multiple copies being made over time, thus compromising the ‘one version of truth’ and possibly damaging any insights that could be derived.

Business users unfamiliar with big data analytics are also prone to mistakes, as they may be unaware of data-preparation complexities – not to mention their own behavioural biases. 

For all these issues, however, education is the solution, which is what Ancestry.com focused on when it began encouraging self-service analytics through its new data-visualisation platform. And with 51 quintillion cells of data you can see why.

 

There’s no harm in starting small with big data analytics

Ancestry.com has over 10 billion historical records and about 10 million registered DNA participants, according to Jose Balitactac, the company’s FP&A Application Manager.

The old application they were using was taking hours to do the calculations.  They looked at seven different applications before deciding on IBM Planning Analytics.  

The reason they chose IBM Planning Analytics was that it could accommodate the company’s super-cube of data; other solutions would have required them to “break it into smaller cubes, or reduce the number of dimensions, or join members, such as business units and cost centers.” They didn’t want to do that because their processes worked.

They set up a test with IBM to time how long the model took to calculate, and it took less than 10-20 seconds, which is what they wanted. You can read more about the Ancestry.com case study here.

If you’re keen to empower your business users through a self-service model, contact Octane today to learn how we can help you harness big data analytics.

 

big-data-analytics-3-1-1

 

Line

A TM1 Guide on How To for Dummies

8 min read

Contents
  1. How to Create Dynamic Parameters for TM1
  2. How to Build a Scorecard using TM1 Architect
  3. How to Create a TM1 Security Overlay
  4. How to Use Stargate Views for TM1 Cube Viewer

 

 

How to Create Dynamic Parameters for TM1

CheckFeedersMaximumCells is a dynamic parameter that allows users to restrict and control the Check Feeders operation in a cube to a selected number of cells. As per the documentation, the default value is 3,000,000, meaning feeders for consolidations with up to 3 million intersections can be checked by default.

Case in point: I’ve added the CheckFeedersMaximumCells parameter to the tm1s.cfg file and set the value to 1, implying that Check Feeders should apply to no more than 1 cell.

CheckFeedersMaximumCells=1

Post the change, let’s look at the real-life results of adding this parameter:

 Dynamic Parameters

In the above screenshot, I’ve performed a Check Feeder operation on one cell and it aptly shows the result.

However, if I were to do the same on a consolidation, “PG1”, it throws an error as illustrated in the screenshot below.

dynamic parameters 1 (1)

Note 1: If the intersection count or parameter value exceeds 99, the number is displayed in scientific notation.

Note 2: If you would like to effectively disable checking feeders for whatever reason, you can do so by assigning a value of -1 to the config parameter.
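
Putting the two notes together, the relevant tm1s.cfg entries might look like the sketch below (illustrative only; keep whichever single line suits your environment):

[TM1S]
# Default behaviour: check feeders for consolidations of up to 3 million cells
CheckFeedersMaximumCells=3000000
# Or effectively disable the Check Feeders limit
# CheckFeedersMaximumCells=-1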

 

 

How to Build a Scorecard using TM1 Architect

Cube used: Sales_Plan

Dimensions: Subsidiaries, Channel, Product, Month, Versions and Sales_Plan_Measures

 

To add traffic lights in TM1 Architect, follow these steps:

1. Enhance the Metrics Dimension

A Metric Dimension contains your collection of important measures or key performance indicators (KPI) that you want to monitor in your business.

  1. Add 2 numeric attributes to the ‘Sales_Plan_Measures’ dimension, namely ‘Performancepattern’ and ‘Tolerancetype’.
  2. Enter a value of 2 for the ‘Unit Cost’ element in the ‘Tolerancetype’ attribute and leave the rest of the elements at 0.

 scorecard attributes TM1

2. Enhance the indicator dimension

In scorecarding, a metric indicators dimension provides more information about your key performance indicators (KPIs) or metrics. Examples of metric indicators include Score, Status and Trend.

  1. Insert a new attribute in the ‘Versions’ dimension and name it ‘Renderer’.
  2. For the Status member, enter the ‘Renderer’ value ‘trafficlight’.
  3. For the Trend member, enter the ‘Renderer’ value ‘metrictrend’.

scorecard attributes TM1 renderer

3. Enhance the Time Dimension

  1. Add a new attribute to the ‘Month’ dimension and name it ‘Previous’.
  2. Populate the ‘Previous’ attribute with the invariant name of the preceding period at the same level. (Example: for Feb enter ‘Jan’, and so on.)
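
If you prefer to script these attribute changes rather than click through Architect, the same setup can be sketched in the Prolog tab of a TurboIntegrator process. This is an illustrative sketch using the dimension, element and attribute names above; adjust them to your own model:

# Metrics dimension: add the two numeric attributes
AttrInsert('Sales_Plan_Measures', '', 'Performancepattern', 'N');
AttrInsert('Sales_Plan_Measures', '', 'Tolerancetype', 'N');
AttrPutN(2, 'Sales_Plan_Measures', 'Unit Cost', 'Tolerancetype');

# Indicator dimension: add the Renderer string attribute
AttrInsert('Versions', '', 'Renderer', 'S');
AttrPutS('trafficlight', 'Versions', 'Status', 'Renderer');
AttrPutS('metrictrend', 'Versions', 'Trend', 'Renderer');

# Time dimension: add the Previous string attribute
AttrInsert('Month', '', 'Previous', 'S');
AttrPutS('Jan', 'Month', 'Feb', 'Previous');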

 

 

How to Create a TM1 Security Overlay

A Security Overlay is a type of cell security which restricts users’ ability to write to a cube, without changing the dimensions and without needing to change the underlying TM1 security. A Security Overlay prevents updates to cell data by all users except the Administrator.

Using a Security Overlay cube you can restrict only selected dimensions of the cube, as with Cell Security. A Security Overlay does not apply to an Admin user, whereas Cell Security can be applied to Admin as well as other users.

The Security Overlay cube is prefixed just like other security cubes:

}SecurityOverlayGlobal_CubeName

The Security Overlay cube contains all the mapped dimensions from the source cube, and the last dimension is the ‘}SecurityOverlay’ dimension.

security overlay

The }SecurityOverlay dimension defines the data stored in the overlay cube. It contains a single string element called ‘OverlayData’, which stores the data used to implement the overlay. A value of ‘1’ must be entered in OverlayData to restrict write access at all intersections corresponding to the dimensions in the Security Overlay cube.

To create a Security Overlay cube via TI, the ‘SecurityOverlayCreateGlobalDefault’ function must be used.

This function can be called in either the Metadata or Prolog tab, as functions that update or create security must not be used in the Data or Epilog tabs.

security overlay global default

In the above screenshot, ‘Price’ refers to the cube name. There are 5 dimensions in the Price cube. A ‘1’ indicates that the dimension is included in the Security Overlay cube and a ‘0’ indicates that the dimension is excluded.

The TI function in the above screenshot therefore creates a Security Overlay cube with the 1st, 2nd and 4th dimensions.
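
Written out in TI, that Prolog call would look like the sketch below (assuming the five-dimensional Price cube from the screenshot):

# Include the 1st, 2nd and 4th dimensions in the Security Overlay cube
SecurityOverlayCreateGlobalDefault('Price', 1, 1, 0, 1, 0);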

Below is the screenshot of the Security Overlay Cube View

Security Overlay Cube

As per the above screenshot, a value of 1 is entered at the intersection of ‘Department Store’, ‘Budget Version 1’ and ‘Cooking Gear’, which makes all the intersections containing ‘Department Store’, ‘Budget Version 1’ and ‘Cooking Gear’ read-only.

Below is a snapshot of how the Security Overlay cube view looks after applying the Security Overlay for the other users.

Security Overlay Other Users

As per the above screenshot, the intersection of Department Store, Cooking Gear and Budget Version 1 is not editable, while all the other cells remain editable.

Where a Security Overlay differs from TM1 Cell Security is that Cell Security can grant different privileges, such as READ, WRITE, LOCK, NONE, RESERVE and ADMIN, whereas in a Security Overlay cube everyone except the Admin is restricted to READ access. Cell Security can be applied to all users, including Admin.

There is no requirement to execute any process after you update the Security Overlay cube; the restriction comes into effect immediately, with no time lag and no need to refresh security.

The ‘SecurityOverlayGlobalLockNode’ function must be used to restrict the access rights of a node to read-only by locking it.

A Security Overlay cube can be deleted or destroyed using the ‘SecurityOverlayDestroyGlobal’ TurboIntegrator function.

To create a Cell Security cube via TI, the ‘CellSecurityCubeCreate’ function must be used; similarly, the ‘CellSecurityCubeDestroy’ function can be used to destroy a Cell Security cube (see the sketch below).
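
A rough sketch of those Cell Security calls, following the same include/exclude idea as the overlay example above (the colon-delimited flag format is an assumption; check the TI reference for your version before relying on it):

# Create a Cell Security cube for Price, including the 1st, 2nd and 4th dimensions
CellSecurityCubeCreate('Price', '1:1:0:1:0');
# Destroy it again when it is no longer needed
CellSecurityCubeDestroy('Price');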

 

 

How to Use Stargate Views For TM1 Cube Viewer 

A Stargate view is a subsection of a TM1 cube that TM1 creates when you browse the cube with the Cube Viewer. It is also known as a ‘view cache’. A Stargate view is different from a TM1 view object, and hence does not contain the formatting information and browser settings that are available in a view object.

A Stargate view helps you access cube data more rapidly than the actual TM1 view object. It contains only the data for a defined section of a TM1 cube, namely the data defined by the title elements and the row and column subsets.

TM1 stores a Stargate view when you access a view that takes longer to retrieve than the threshold defined by the VMT property in the }CubeProperties control cube. If no VMT value is defined, a Stargate view is generated when a view takes longer than five seconds; this is the default threshold.

A Stargate view persists in memory for as long as the browser view from which it originates remains unchanged. If you change and recalculate the browser view, TM1 replaces the existing Stargate view in memory with a new one based on the recalculated view.

TM1 removes the Stargate view, or cache, when you close the browser view; it is also removed if you restart the TM1 application.

TM1 provides settings to control Stargate views, namely VMT (View Minimum Time) and VMM (View Maximum Memory), both of which can be seen in the ‘}CubeProperties’ control cube.

VMM is expressed in KB. If no value is specified, a default of 128 is used.

VMT is expressed in seconds. If no value is specified, a default of 5 is used.
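
Both properties are plain string values in }CubeProperties, so they can be set from TI with CellPutS. A minimal sketch for the Sales_Plan cube used earlier (the thresholds are example values, not recommendations):

# Cache a Stargate view once retrieval takes longer than 3 seconds
CellPutS('3', '}CubeProperties', 'Sales_Plan', 'VMT');
# Allow up to 1024 KB of Stargate cache for this cube
CellPutS('1024', '}CubeProperties', 'Sales_Plan', 'VMM');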

 

Try out our TM1 $99 training 
Join our TM1 training session from the comfort of your home

find out more

 

Line

How CFOs can improve a company's efficiency by automating the process

2 min read

As the CFO of your company, your organisation relies on you for more than just numbers. Yes, spreadsheets and financial reports are the hallmarks of your profession.

But what about business intelligence and forecasting?

Today, the explosion of data available to business matches the explosion of competition in today’s markets, and CFOs need to harness that wealth of data to compete. CFOs need better business intelligence tools to identify upcoming opportunities and grab them before their competitors do. They also need to foresee potential losses and advise on how to mitigate risk. In short, they need to focus on analytics to be able to predict the future, so their business owners can react quickly.

Predicting the Future with Business Intelligence Solutions

 

How-CFOs-can-improve-a-company's-efficiency.jpg

Despite the high-tech dashboards available, many CFOs are still trying to reach high performance using familiar tools like Excel. Trying to compete in the hottest markets with these, many CFOs feel like they’re trying to build a fire with flint and steel.

The CFOs of advanced companies are no longer content to see their staff pumping out spreadsheets that need hours of explanation and analysis. With the overwhelming amount of data available, better business intelligence and reporting demand sophisticated tools to drive a company from the base camp to the summit. CFOs must be able to visualise inter-relational data groups, see the connection among KPIs, identify the potential for profit and envision the future while strategising.

What they need to boost performance is a financial reporting dashboard to accumulate data from different business units and provide insights into alternative strategies. Once various data sets are connected visually, patterns emerge, extrapolation is possible, and the future seems clearer.


So why should CFOs reassess their current Business Intelligence tools?

Because they know they are wasting time creating graphs and spreadsheets. Confounding reports don’t allow business metrics to be compared at a glance, and in the end CFOs know they can improve efficiency by making visual data immediately accessible to key business units.

Recently, Octane Software Solutions launched the FREE Business Intelligence Assessment to help companies gain more value from their current BI and analytics investments as well as provide road mapping to create cost effective BI strategies.

 Find out more: http://info.octanesolutions.com.au/ibm-cognos-tm1


 

Line

Take complexity out of Excel and systemize data with IBM Cognos TM1

2 min read

IBM Cognos TM1 supported by Planning Analytics is the ideal technology for making Excel-centric budgeting, forecasting, financial reporting and any spreadsheet-based analytical application scale to the enterprise.

IBM's TM1 for Microsoft Excel brings the power and familiarity of Excel to the arena of modern enterprise performance management. Planning Analytics, on the other hand, enables business and financial analysts, line-of-business managers and others to explore and analyse data from a variety of different sources—including IBM Cognos TM1 and IBM Cognos Business Intelligence—without IT support.

Thanks to IBM Planning Analytics, Microsoft Excel users can:

IBM-Planning-Analytics,-Microsoft-Excel.jpg


1. Explore and analyse data

  • Use one add-in to analyse all your data: dimensional or relational, performance management or business intelligence, proprietary IBM data or third-party sources.
  • Drag and drop members, dimensions and views from the metadata tree of the company's TM1 model directly into spreadsheet columns or rows using the Exploration interface.
  • Link and refresh planning and operational data sources in the same Excel worksheets, by connecting to Cognos TM1 and Cognos Business Intelligence through a single analysis interface.
  • Select and filter data, then define layouts to gain a managed view of information through a single, intuitive interface.

2. Systemise data

  • Use Microsoft Excel capabilities to enrich business scenarios and present results that are easy to understand and reuse.
  • Create objects in Cognos TM1 models and reuse them in the spreadsheet environment of IBM Planning Analytics.

3. Share results

  • Publish Cognos Business Intelligence-sourced reports, including new column and row calculations, in standard Cognos reports and dashboards.
  • Engage teams and individuals from across the organisation for more inclusive and representative decision-making and analysis.
  • Deliver analysis results to the processes and people at the front line of the business.

4. Extend analysis capabilities

  • Extend access to Cognos TM1 data to a broader range of users to improve their decision-making.
  • Perform flexible analysis and solve business problems using familiar spreadsheet tools and techniques.
  • Analyse data from different sources and build reports and dashboards without the need for IT assistance.
  • Build control panels to make it easy to change information in a report with a single click.

 

If you would like to find out more about IBM Planning Analytics and IBM Cognos TM1 capabilities, give us a call and let's discuss the future of your business.

 
