
Unlocking the Power of IBM Planning Analytics with Execute HTTP Request




Today, we’re excited to explore a game-changing function that enhances the versatility of your Planning Analytics platform. Imagine a tool that not only streamlines your data processes but also connects your Planning Analytics seamlessly with external systems. This innovation allows you to execute HTTP requests directly within your TurboIntegrator (TI) processes, transforming your Planning Analytics into an integral part of your interconnected ecosystem. Join us as we delve into the possibilities this function brings and how it can elevate your data management strategies to new heights.


Why Execute HTTP Request is a Game-Changer

What makes this function truly versatile is its ability to connect with any external system that supports APIs. The only limit is your imagination and the capabilities of the APIs you wish to connect to. Integrating Planning Analytics with external systems allows developers to break free from traditional limitations and extend the functionality of their applications.

This session will include practical demonstrations of several use cases that highlight the power of the Execute HTTP Request. By the end, I hope to inspire you to explore how you can leverage this function to enhance your TM1 applications and workflows.

Demo 1: Hot Promotion of Objects Between Instances

Let's dive into our first demonstration on how to perform hot promotion of objects from one instance to another. Traditionally, migrating objects between instances involved shutting down the target server. However, using the Execute HTTP Request, we can do this in real-time.

  1. Setting Up the TI Process:
     
    • Open the workbench in your workspace and create a new TI process.
    • Declare necessary constants and set your source and target instances (e.g., SmartCo to Demo Server).
  2. Use the Execute HTTP Request:
     
    • Fetch dimensions from the source instance and check for their existence in the target instance.
    • For any dimensions that do not exist, save them as JSON files and use the Execute HTTP Request to migrate them to the target instance.
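
For reference, here is a minimal TurboIntegrator sketch of the migration call. It assumes a v12 database where ExecuteHttpRequest and its companion response functions (HttpResponseGetStatusCode, HttpResponseGetBody) are available and accept curl-style option strings; the host, dimension JSON and status handling are illustrative placeholders, not the exact process shown in the demo.

TI code (illustrative):

    # Prolog - placeholders only: sTargetUrl and sDimJson would normally be built dynamically
    sTargetUrl = 'https://<target-host>/api/v1/Dimensions';
    sDimJson   = '{ "Name": "Region", "Hierarchies": [ { "Name": "Region" } ] }';

    # Create the dimension on the target instance via its REST API
    ExecuteHttpRequest( 'POST', sTargetUrl,
        '-h Content-Type: application/json',
        '-d ' | sDimJson );

    # The response helpers are assumed to return strings
    sStatus = HttpResponseGetStatusCode();
    IF( sStatus @<> '201' );
        LogOutput( 'ERROR', 'Dimension migration failed: HTTP ' | sStatus | ' - ' | HttpResponseGetBody() );
        ProcessError;
    ENDIF;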

Let’s execute this process! Once completed, you’ll see that the dimensions have been successfully migrated.

Demo 2: Executing a Process Across Instances

Next, we'll demonstrate the ability to execute a process on one instance from another:

  1. Migrate TI Processes:
     
    • Similar to dimension migration, retrieve the TI process (like Sample TI) from the source instance and save it as a JSON file.
  2. Execute the TI Process:
     
    • Use the Execute HTTP Request to trigger execution on the target instance, using its response to capture status codes and log outputs.
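
As a sketch of the remote-execution call (same assumptions as the sketch in Demo 1; the process name 'Sample TI', the parameter and the host are placeholders):

TI code (illustrative):

    # Trigger the migrated process on the target instance and capture the outcome
    # Char(39) inserts the single quotes that the OData URL requires
    sUrl  = 'https://<target-host>/api/v1/Processes(' | Char(39) | 'Sample TI' | Char(39) | ')/tm1.ExecuteWithReturn';
    sBody = '{ "Parameters": [ { "Name": "pRegion", "Value": "AU" } ] }';
    ExecuteHttpRequest( 'POST', sUrl,
        '-h Content-Type: application/json',
        '-d ' | sBody );
    LogOutput( 'INFO', 'Remote run returned HTTP ' | HttpResponseGetStatusCode() | ' with body ' | HttpResponseGetBody() );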

After running this process, you should see that the process has been both migrated and executed successfully.

Demo 3: Loading Currency Conversion Rates

In this demo, we will load real-time currency conversion rates from a website using its API:

  1. Call the API:
     
    • Set up an HTTP GET request to retrieve USD conversion rates.
  2. Extract and Utilise Data:
     
    • Capture the JSON response and extract required currency rates using JSON functions.
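
A minimal sketch of the rate load: the API endpoint, the target cube 'FX Rates' and its element names are illustrative, and the parsing below uses plain string functions rather than any particular JSON helper.

TI code (illustrative):

    # GET the latest USD-based rates from a hypothetical public API
    ExecuteHttpRequest( 'GET', 'https://api.example.com/v1/latest?base=USD' );
    sJson = HttpResponseGetBody();

    # Crude extraction of the AUD rate with string functions
    nPos = SCAN( '"AUD":', sJson );
    IF( nPos > 0 );
        sRate = SUBST( sJson, nPos + 6, 12 );
        nEnd  = SCAN( ',', sRate );
        IF( nEnd > 0 );
            sRate = SUBST( sRate, 1, nEnd - 1 );
        ENDIF;
        CellPutN( StringToNumber( sRate ), 'FX Rates', 'USD', 'AUD', 'Rate' );
    ENDIF;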


Run the process, and you will observe the real-time conversion rates being fetched and displayed.

Demo 4: Sending Teams Notifications

Next, I’ll show you how to send automated notifications to Microsoft Teams:

  1. Integrate with Microsoft Power Automate:
     
    • Set up a Power Automate flow to send notifications.
  2. Trigger Notification from System:
     
    • Use Execute HTTP Request to trigger alerts in Teams based on process execution results.
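
A sketch of the notification call: the flow URL is whatever HTTP trigger your Power Automate flow generates (placeholder below), and the JSON fields are whatever schema that flow expects.

TI code (illustrative):

    # POST a simple JSON payload to the Power Automate HTTP trigger
    sFlowUrl = 'https://<your-power-automate-http-trigger-url>';
    sMsg     = '{ "title": "TM1 process finished", "text": "Nightly load completed successfully" }';
    ExecuteHttpRequest( 'POST', sFlowUrl,
        '-h Content-Type: application/json',
        '-d ' | sMsg );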


After execution, you should see notifications appear in your Teams channel.

Demo 5: Sending Emails via HTTP Requests

Finally, we'll explore how to send emails:

  1. Power Automate for Email Notifications:
     
    • Again, set up Power Automate to manage email sending through appropriate HTTP requests.
  2. Dynamic Email Content:
     
    • Utilise dynamic fields for subject and body based on execution results.
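
The email case differs only in the payload. A sketch, assuming the Power Automate flow reads subject and body fields from the JSON it receives, and that nErrors was counted earlier in the process:

TI code (illustrative):

    # Build the subject and body dynamically from the current run, then POST to the flow
    sSubject = 'TM1 load status for ' | TimSt( Now, '\d-\m-\Y' );
    sBody    = 'Process finished with ' | NumberToString( nErrors ) | ' records skipped.';
    ExecuteHttpRequest( 'POST', 'https://<your-email-flow-url>',
        '-h Content-Type: application/json',
        '-d { "subject": "' | sSubject | '", "body": "' | sBody | '" }' );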

After executing this process, you will receive the email in your mailbox.

Conclusion

Today, we have unlocked the extensive capabilities of the Execute HTTP Request function in IBM Planning Analytics. We showcased hot promotion between instances, cross-instance process execution, real-time data fetching, as well as integration with Microsoft Teams and email notifications. 

Thank you all for attending this session. I hope you found it beneficial and feel inspired to explore the functionality of Planning Analytics further. Let’s move toward a more integrated and dynamic future in our analytics processes!


Line

Navigating the Storm: A Double Migration


The past few weeks have been a whirlwind, a high-stakes balancing act that tested the limits of our team's resilience and expertise. We simultaneously managed two major client migrations. Both were for high-profile, large clients and part of their IBM Planning Analytics modernisation initiatives:


The clients

News Corp - News Corp Australia tells the stories that matter to 18.2 million Australians every month as an important part of News Corp, a diversified global media and information services company. From breaking news in the morning to deciding dinner that night, Australia trusts our brands to inform, inspire and delight across the day – including The Australian, The Daily Telegraph, Herald Sun, The Courier-Mail, The Advertiser, Mercury, NT News, Townsville Bulletin, The Cairns Post, Gold Coast Bulletin, Geelong Advertiser, news.com.au, Vogue, GQ, Kidspot, taste.com.au and plenty more. More here https://www.newscorpaustralia.com/

A long-time TM1 user, News Corp was looking to refresh its application and modernise it by moving to the cloud, utilising the new dashboarding capabilities of Workspace, and starting to test AI capabilities for the Finance team. TM1 is one of the core applications within the Finance team, and they could not afford to run the risk of a prolonged upgrade to the cloud.

BlueScope - BlueScope is a global leader in metal coating and painting products for the building and construction industries, providing vital components for houses, buildings, structures, vehicles, and more.

They have built a solid foundation for growth with a diverse portfolio of businesses in some of the largest and fastest-growing economies of the world. They are headquartered in Australia, with their people and operations spread across North America, Australia, New Zealand, the Pacific Islands, and throughout Asia. More here https://www.bluescope.com/

BlueScope is also a long-term TM1 user, with the application used across a number of areas including demand planning, forecasting and reporting in the Finance teams. They were on an older version of on-premises TM1 and upgraded to the latest version of on-premises IBM Planning Analytics (TM1).

A Perfect Storm

Both projects presented unique challenges. NEWS's migration required significant user training and change management, while BlueScope's upgrade involved complex technical configurations and intricate coordination with multiple stakeholders. To make matters even more challenging, both go-lives were scheduled for the same day!

Overcoming the Odds

How did we navigate this perfect storm?

  • Strong Leadership: Clear and decisive leadership was crucial in keeping the projects on track. By setting clear expectations, prioritising tasks, and making timely decisions, we were able to mitigate risks and ensure smooth execution.
  • Effective Teamwork: Our team demonstrated exceptional teamwork and collaboration. By working closely together, we could share knowledge, support each other, and address challenges proactively.
  • Agile Methodology: We adopted an agile approach, breaking down the projects into smaller, manageable phases. This allowed us to adapt to changing circumstances and deliver value incrementally. We have also built up a comprehensive checklist for our TM1 upgrades, which makes them easier and lower risk.
  • Robust Communication: Open and transparent communication was key to keeping all stakeholders informed and aligned. Regular status updates, clear documentation, and effective problem-solving ensured a smooth transition.

Lessons Learned

These experiences have taught us valuable lessons:

  • Prioritise and Plan: Careful planning and prioritization are essential, especially when managing multiple projects simultaneously.
  • Embrace Flexibility: Be prepared to adapt to unexpected challenges and changes in scope.
  • Build Strong Relationships: Strong relationships with clients and team members are crucial for successful project delivery.
  • Learn from Mistakes: Analyse past projects to identify areas for improvement and avoid repeating errors.

Successful outcomes for both clients

The upgrade was successful for both clients and went live on the same day. This was a testament to my team’s technical ability and tenacity to ensure we followed our upgrade checklist. All testing and end-user training went well. One of our key tenets of upgrades is to ensure that client communication around changed functionality and look and feel is explained, trained and tested. The stakeholders from our client side were great and the whole team went above and beyond during the upgrade and deployment.

Both upgrades took under six weeks to complete, with minimal disruption to the business.

A huge shoutout goes to Alpheus and Rajan for their stellar work on the NEWS migration, ensuring a smooth transition to the Cloud, and to Baburao for expertly managing the BlueScope upgrade, and overcoming every hurdle that came our way.

Your Turn

Have you faced similar challenges in managing multiple simultaneous projects? How did you overcome them? Share your experiences and insights in the comments below.

If you are looking to modernise or upgrade your IBM Planning Analytics, then contact us and we would be happy to guide you. 

Line

Enhancing Planning Analytics Workspace (PAW) visualisations using MDX


Planning Analytics Workspace (PAW) offers a robust suite of visualizations, enabling users to create rich and compelling reports and dashboards with remarkable flexibility. However, even with these capabilities, you may occasionally encounter requirements that push the limits of what PAW provides out of the box. 

One such scenario I encountered was the need to create a column chart comparing Actual vs Budget variance. The twist? Any negative variance should be highlighted with a red bar, while positive variance should be displayed in green, as shown below: 


PAW’s default settings don't currently offer this kind of custom conditional formatting for visualizations. However, with a little MDX magic and a few formatting tweaks, you can achieve this effect in just five simple steps. 

Step-by-Step Guide to Creating Custom Visualizations in PAW 

Step 1: Position the Version Dimension in the Column 

Start by positioning the Version dimension in the column of the Exploration view. This is where we will apply the MDX logic to derive the desired results. 

Step 2: Use MDX to Create Calculated Members 

Next, you'll need to update the MDX query by creating three calculated members to represent Actual vs Budget (AvB), Positive Variance, and Negative Variance. 

Here’s the MDX code: 


WITH
MEMBER [Version].[Version].[AvB] AS [Version].[Version].[Actual] - [Version].[Version].[Budget]
MEMBER [Version].[Version].[Positive] AS IIF([Version].[Version].[AvB] > 0, [Version].[Version].[AvB], "")
MEMBER [Version].[Version].[Negative] AS IIF([Version].[Version].[AvB] < 0, [Version].[Version].[AvB], "")

Note: The AvB calculation could also be done using a consolidated member in the Version dimension, where the Budget has a negative weight. 

Step 3: Replace the MDX in the Row Axes 

Now, replace the MDX in the Row Axes relating to the Version dimension to show only the Positive and Negative calculated members, while excluding the AvB calculation (and any other member): 

MDX code: 

    EXCEPT(
        {
            [Version].[Version].[AvB],
            [Version].[Version].[Positive],
            [Version].[Version].[Negative]
        },
        {
            [Version].[Version].[AvB]
        },
        ALL
    )

This MDX will generate a view that displays only Positive and Negative members in the Version dimension, leaving the non-relevant member (whether positive or negative) as blank, depending on the AvB value. 

Step 4: Convert the Exploration View into a Column Chart

Once the MDX has been applied, convert the Exploration view into a Column Chart. By default, PAW will show the columns for positive and negative values with its standard color scheme.  


 

Step 5: Apply a Custom Color Palette 

To finalize the visualization, we’ll apply a custom color palette. Navigate to the visualization properties and create a color palette that includes only two colors: green for positive values and red for negative values. 


Conclusion 

With just a few lines of MDX and a bit of customization, you can significantly enhance PAW visualizations. This technique allows you to move beyond the standard out-of-the-box options, giving you the flexibility to create more intuitive and visually effective reports. Whether you're comparing Actual vs Budget or any other metrics, these methods help you build visuals that not only convey the necessary information but do so in a way that is easy to interpret at a glance. 

By leveraging MDX and PAW’s formatting tools, you can push the boundaries of your reporting and create dynamic, insightful dashboards tailored to your business needs. 

 

Line

Integrating transactions logs to web services for PA on AWS using REST API


In this blog post, we will showcase the process of exposing transaction logging in Planning Analytics (PA) V12 on AWS to users. Currently, Planning Analytics has no user interface (UI) option to access transaction logs directly from Planning Analytics Workspace. However, there is a workaround to expose transactions to a host server and access the logs. By following these steps, you can successfully access transaction logs in Planning Analytics V12 on AWS using the REST API.


Step 1: Creating an API Key in Planning Analytics Workspace

The first step in this process is to create an API key in Planning Analytics Workspace. An API key is a unique identifier that provides access to the API and allows you to authenticate your requests.

  1. Navigate to the API Key Management Section: In Planning Analytics Workspace, go to the administration section where API keys are managed.
  2. Generate a New API Key: Click on the option to create a new API key. Provide a name and set the necessary permissions for the key.
  3. Save the API Key: Once the key is generated, save it securely. You will need this key for authenticating your requests in the following steps.

Step 2: Authenticating to Planning Analytics As a Service Using the API Key

Once you have the API key, the next step is to authenticate to Planning Analytics as a Service using this key. Authentication verifies your identity and allows you to interact with the Planning Analytics API.

  1. Prepare Your Authentication Request: Use a tool like Postman or any HTTP client to create an authentication request.
  2. Set the Authorization Header: Include the API key in the Authorization header of your request. The header format should be Authorization: Bearer <API Key>.
  3. Send the Authentication Request: Send a request to the Planning Analytics authentication endpoint to obtain an access token.

Detailed instructions for Step 1 and Step 2 can be found in the following IBM technote:

How to Connect to Planning Analytics as a Service Database using REST API with PA API Key

Step 3: Setting Up an HTTP or TCP Server to Collect Transaction Logs

In this step, you will set up a web service that can receive and inspect HTTP or TCP requests to capture transaction logs. This is crucial if you cannot directly access the AWS server or the IBM Planning Analytics logs.

  1. Choose a Web Service Framework: Select a framework like Flask or Django for Python, or any other suitable framework, to create your web service.
  2. Configure the Server: Set up the server to listen for incoming HTTP or TCP requests. Ensure it can parse and store the transaction logs.
  3. Test the Server Locally: Before deploying, test the server locally to ensure it is correctly configured and can handle incoming requests.

For demonstration purposes, we will use a free web service provided by Webhook.site. This service allows you to create a unique URL for receiving and inspecting HTTP requests. It is particularly useful for testing webhooks, APIs, and other HTTP request-based services.

Step 4: Subscribing to the Transaction Logs

The final step involves subscribing to the transaction logs by sending a POST request to Planning Analytics Workspace. This will direct the transaction logs to the web service you set up.

Practical Use Case for Testing IBM Planning Analytics Subscription

Below are the detailed instructions related to Step 4:

  1. Copy the URL Generated from Webhook.site:
    • Visit Webhook.site and copy the generated URL (e.g., https://webhook.site/<your-unique-id>). The <your-unique-id> refers to the unique ID found in the "Get" section of the Request Details on the main page.

  2. Subscribe Using the Webhook.site URL:
    • Open Postman or any HTTP client.
    • Create a new POST request to the subscription endpoint of Planning Analytics.
    • In the body of the POST request, paste the URL generated from Webhook.site:

{
 "URL": "https://webhook.site/your-unique-id"
}
In the subscription endpoint URL, <tm1db> is a placeholder for the name of your TM1 database.

Note: Only the transaction log entries created at or after the point of subscription will be sent to the subscriber. To stop the transaction logs, update the POST query by replacing /Subscribe with /Unsubscribe.

By following these steps, you can successfully enable and access transaction logs in Planning Analytics V12 on AWS using REST API.

Line

Mastering Calculations in Planning Analytics: Adapt to Changing Months with Ease


One of the standout features of Planning Analytics Workspace (PAW) is its ability to create calculations in the Exploration view. This feature empowers users to perform advanced calculations without the need for technical expertise. Whether you're using PAW or PAfE (Planning Analytics for Excel), the Exploration view offers a range of powerful capabilities. The Exploration view supports a variety of functions, such as aggregations, mathematical operations, conditional logic, and custom calculations. This means you have the flexibility to perform complex calculations tailored to your specific needs. 

This enables users to create complex financial calculations and business rules within their views, providing more accurate and tailored results for analysis and planning. All of this can be done by business users themselves, without relying on IT or development teams, which makes reporting faster and more agile: ad hoc reports and self-service analysis can be produced on the fly with a few simple clicks. This self-service capability puts control in the hands of the users, eliminating lengthy communication processes and the wait for IT teams to fulfil reporting requests.

In this blog post, we will focus on an exciting aspect of the Exploration view: creating MDX-based views that are dynamic and automatically update as your data changes. The beauty of these dynamic views is that users no longer need to manually select members of dimensions to keep their formulas up to date.

Similar to the functionality of dynamic subsets in dimensions, where each click in the set editor automatically generates MDX statements that can be modified, copied, and pasted, the exploration views in Planning Analytics Workspace also generate MDX statements. These MDX statements are created behind the scenes as you interact with the cube view. Just like MDX subsets, these statements can be easily customized, allowing you to fine-tune and adapt them to your specific requirements.

By being able to tweak, copy, and paste these MDX statements, you can easily build upon previous work or share your calculations with others.

Currently, the calculations are not inherently dynamic, however, there are techniques that can be employed to make the calculations adapt to changing time periods.

A classic example is performing variance analysis on a P&L cube, where we wish to add a variance formula showing the variance of the current month from the previous month. There are many more calculations we could consider, but we will focus on this one in this blog.

If we take our example, the current month and previous month keep changing every month as we roll forward and they are not static. When dealing with changing months or any member in your calculation, it's important to ensure that your calculations remain dynamic and adaptable to those changes. 

To ensure dynamic calculations that reflect changes in months, you have several options to consider:

Manual Approach: You can manually update the column dimensions with the changing months and recreate the calculations each time. However, this method is time-consuming, prone to errors, and not ideal for regular use.

Custom MDX Approach: Another option is to write custom MDX code or modify existing code to reference the months dynamically from a Control cube. While this approach offers flexibility, it can be too technical for end users.

Consolidations Approach: Create consolidations named "Current Month" and "Prior Month" and add the respective months to them as children. Then, use these consolidations in your view and calculations. This approach provides dynamic functionality, but you may need to expand the consolidations to see the specific months, which can be cumbersome.

Alias Attributes Approach: Leverage alias attributes in your MDX calculations. By assigning aliases to the members representing the current and previous months, you can dynamically reference them in your calculations. This approach combines the benefits of the previous methods, providing dynamic calculations, visibility of months, and ease of use without excessive manual adjustments.

In this blog post, we will focus on the alias attributes approach as a recommended method for achieving dynamic calculations in PAW or PAfE. We will guide you step-by-step through the process of utilizing alias attributes to ensure your calculations automatically adapt to changing months. By following this approach, you can simplify your calculations, improve efficiency, and enable non-technical users to perform dynamic variance analysis effortlessly.

To create dynamic calculations for variances between the current and prior month, you can follow these steps:

  • Step 1: Ensure you have an alias attribute available in your Month dimension. If not, create a new alias attribute specifically for this purpose.
  • Step 2: Update the alias with the values "Curr Month" and "Prior Month" for the respective months.
  • Step 3: Open the exploration view in PAW and select the two months (current and prior) on your column or row dimension. 
  • Step 4: Create your variance calculation using the exploration view's calculation capabilities. This could involve subtracting the P&L figures of the prior month from the current month, for example.
  • Step 5: Open the MDX code editor and replace the actual month names in the MDX code with the corresponding alias values you updated in Step 2. You can copy the code into Notepad and use the "Find and Replace" function to make this process faster and more efficient.
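
To roll the aliases forward each month without touching the view, the alias values themselves can be updated by a small TI step. A minimal sketch, assuming a Month dimension, an alias attribute named 'Report Alias', and illustrative month element names (in practice the month names would usually come from a control cube):

TI code (illustrative):

    # Alias values must stay unique, so clear last month's tags before re-assigning them
    AttrPutS( '', 'Month', 'Jun-2024', 'Report Alias' );
    AttrPutS( '', 'Month', 'Jul-2024', 'Report Alias' );
    AttrPutS( 'Curr Month',  'Month', 'Aug-2024', 'Report Alias' );
    AttrPutS( 'Prior Month', 'Month', 'Jul-2024', 'Report Alias' );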


By replacing the month names with the alias values, you ensure that the calculation remains dynamic and adapts to the changing months without manual intervention. When you update the alias values in the Month dimension, it will reflect in the exploration view. As a result, the months displayed in the view will be dynamically updated based on the alias values. This ensures that your calculations remain synchronized with the changing months without the need for manual adjustments.


Important Note: When selecting the months in the set editor, it is crucial to explicitly select and move the individual months from the Available members pane (left pane) to the Current set pane (right pane). This ensures that unnecessary actions, such as expanding a quarter to select a specific month, are not recorded in the MDX code generated in the exploration view, which could otherwise lead to issues when replacing the member names with alias values.

This approach of using alias attributes to make calculations dynamic can be extended to various other calculations in Planning Analytics Workspace. It provides a flexible and user-friendly method to ensure that your calculations automatically adapt to changing dimensions or members.

That being said, it's important to note that there may be certain scenarios where alternative approaches, such as writing custom MDX code or utilizing a control cube, are necessary. Each situation is unique, and the chosen approach should align with the specific requirements and constraints of the calculation; however, the proposed approach should still work for a wide variety of calculations in IBM Planning Analytics.

Line

Exploring the Latest Enhancements of IBM Planning Analytics Components


As the world moves towards more data-driven decision-making, businesses are increasingly looking for effective planning and budgeting solutions. IBM Planning Analytics is the go-to for businesses looking for a comprehensive set of tools to help them manage their budgeting and planning process.


With Planning Analytics, businesses can access powerful analytics to make more informed decisions, leverage advanced features to create complex models, and gain better insights into their financial data.

IBM is constantly improving the functionalities and features of the IBM Planning Analytics components. This includes Planning Analytics Workspace (PAW), Planning Analytics for Excel (PAfX), and Planning Analytics with Watson. With these updates, businesses can take advantage of new features to help them manage their budgeting and planning process more effectively.

In the last 12 months, IBM has released several updates to its Planning Analytics components.

In PAW, users can now access advanced analytics such as forecast simulations, predictive models, and scenario analysis. They can also perform in-depth analysis on their data with the new Visual Explorer feature. In addition, users can now access a library of planning and budgeting models, which can be customized to fit the needs of their organization. (download PDF file to get the full details)


 


In PAfX, users can now access advanced features such as SmartViews and SmartCharts. SmartViews allows users to visualize their data in various ways, while SmartCharts allows users to create interactive charts and graphs. Users can also take advantage of the new custom formatting options to make their reports look more professional.


 


Finally, with Planning Analytics with Watson, users can access powerful AI-driven insights. This includes AI-driven forecasting, which allows users to create more accurate forecasts. In addition, Watson can provide insights into the drivers of their business, allowing users to make more informed decisions.

 


 

Overall, IBM’s updates to the Planning Analytics components provide businesses with powerful tools to help them manage their budgeting and planning process. With these updates, businesses can take advantage of the latest features to quickly access data-driven insights, create more accurate forecasts, and gain better insights into their financial data.

Download the PDF file below to get the full details for each IBM Planning Analytics component.

Line

Planning Analytics Workspace Local Distributed


PAW Local Distributed is an upgrade to Planning Analytics Workspace Local that can be deployed in a container orchestration engine, using either Docker Swarm or Kubernetes (an open-source orchestrator originally from Google), for high availability, failover, scalability, and fault tolerance across multiple application servers, virtual machines, or cloud machines.

Architecture:

 


 

Planning Analytics Workspace Distributed runs in Swarm mode by deploying the application across multiple Docker nodes (each with a unique node ID), collectively known as a swarm.

 

The Docker Engine CLI can be used to create a swarm and to deploy and manage the application services within it.

Swarm mode provides secure connections across multiple servers. Some of the key features it offers include:

  • Cluster management integrated with Docker Engine
  • Declarative service model
  • Desired state reconciliation
  • Horizontal scaling and load balancing
  • Multi-host networking
  • Automatic service discovery
  • Rolling updates with rollback

 


 

The Docker engine maintains high availability by rescheduling a failed node's tasks onto other nodes.

Distributed support was released in version 2.0.41 of Planning Analytics Workspace and is available to download from IBM Fix Central at the link below.

https://www-945.ibm.com/support/fixcentral/swg/selectFixes?product=ibm%2FInformation+Management%2FIBM+Planning+Analytics+Local&fixids=BA-PAWL-2.0.41&source=dbluesearch&function=fixId&parent=Analytics%20Solutions

Note: The Docker Swarm is currently supported on Red Hat Enterprise Linux (RHEL) only.

The Docker Enterprise Edition for RHEL can be downloaded from the following link:

https://docs.docker.com/install/linux/docker-ee/rhel/

 

 

Line

Views integration in PAX and PAW


In the latest version of PAX (2.0.44) as well as PAW (2.0.44), IBM has added a new capability to allow the views to be shared between PAW and PAX using Planning Analytics Workspace Content Store.

 

What does this mean for us?

We can now save views in the content store and share them so they can be accessed from either PAW or PAX, which was not possible earlier. This further strengthens the integration and makes the interoperation between PAX and PAW even more seamless, which is a good step for usability and software portability.

 

Steps to access the views from PAW content store from PAX:

We already know how to save the views in content store within PAW – this is done by clicking the save button in the view and selecting the destination folder in content store.

To access the views saved in the content store, click the View icon on the PAX toolbar, which opens the folder structure of the content store.

Navigate to the folder where the view is saved, then select the desired Report Type at the bottom and click Select.

 


 

Steps to save the views in PAW content store from PAX:

Note: This applies only to Exploration Views in PAX. The other report types can be published to the TM1 Applications folder from PAX and accessed in PAW. A saved Exploration View, when opened in PAW, opens as a normal view.

Open a view in Exploration mode and click the Save View icon on the PAX toolbar under the Exploration tab.

You will be provided with two options to choose from:

  1. Save to Content Store: This saves the view in PAW Content Store
  2. Save to Server: This saves the view to TM1 data server.

 


When saving the view to the server, there is an option to save it as an MDX view; however, please note that if this is checked, the view will not be accessible from Architect, as Architect does not support MDX-based views.

       2.1 When saving the view




 

Once the view is saved in the PAW Content Store, it can then be accessed from PAW, and any changes made to the view and saved in either platform will be reflected in both.

*Tip: You may still be able to create a view in PAW and save it to the TM1 server so it can be accessed from both PAX and Architect; however, be mindful that if you make a change to the view in PAW and try to overwrite it, that will not be possible: you will only be given the option to save the view in the PAW Content Store. It is possible to update the view and save the changes in PAX so that they are reflected across all platforms, but currently not in PAW.

To create a view in PAW, right-click Views and select ‘Create view’. When saved, this view is stored in the TM1 database directory and is visible in both PAX and Architect.

 


 
 
Line

PAX and PAW 2.0.41


What's interesting to note is that Planning Analytics Workspace (PAW) version 2.0.40 and 2.0.41 are combined so that the release of Planning Analytics Workspace 2.0.41 aligns with the IBM Planning Analytics for Microsoft Excel (PAX) release.



Several fixes have been applied to both PAW and PAX in this version, but this post will focus on the enhancements and features. However, should you want to review the fixes, you can find those here.

Features:

With this release, you will now be able to create and edit drill through rules and processes in Workspace.

 



Also, finding users who don't have any permissions is now quick and straightforward using the new menu.

 



A new quality-of-life enhancement is that grid refreshes can be configured to happen automatically on new views when their n-level data changes.

 



Database configuration parameters can now be set from the database activity report.

 



With this, other metrics such as blocked threads will also now appear on the database activity report.

 



The set editor now allows you to define which levels of a hierarchy to include in a report using dynamic ranges. These can be defined as, for example, Level >= level002.

 


 

There are individual icons for dynamic and static sets, helping determine what is dynamic in the hierarchy and what is not.

 



Planning Analytics Workspace can now be distributed on Docker Swarm, but this is currently supported on Red Hat Enterprise Linux only.

You can use Constrained Calculations in PAX to narrow the scope of recalculation to just your active worksheet, increasing the performance and speed of the worksheet.

Whilst this covers the main features, there is plenty more to read; follow this link to IBM for more.

 

Line

Unraveling TM1 : Lesser Known Facets – Part B


Thank you if you have come back for more! We hope our last blog, Unraveling TM1 : Lesser Known Facets – Part A, was meaningful. In this part we will unearth and explore a few more of these lesser-known gems.

As always, if you like what we do and want to associate; subscribe to our Blogs at http://blog.octanesolutions.com.au

 

Function TM1RPTROW

We know TM1RPTROW is a salient function when building demand and rolling forecasts in planning and budgeting models.

Although everyone is aware that parameters such as the dimension subset and MDX expression are part of TM1RPTROW, developers tend to assume the following features are either hard to achieve or time consuming.

TIP 4. Search Functionality on the elements of TM1RPTROW

Subset elements can be filtered by wildcard search in the Subset Editor; we all know this. A similar approach can be used to search for an element in websheets.

As an alternative to subsets in TM1RPTROW, let's look into another parameter that serves the purpose: MDX.

Syntax: TM1RptRow(ReportView, Dimension, Subset, SubsetElements, Alias, ExpandAbove, MDXStatement, Indentations, ConsolidationDrilling)


As an illustration, consider a TM1 Websheet created to demonstrate how this can be done.

As shown in the screenshot below, a search option has been provided for the Model dimension in the view.

 

Create the MDX expression to be referenced in a cell, as illustrated below.


In the screenshot, the MDX (cell I11) is filtered based on the value entered in cell E19.

The MDX parameter of TM1RPTROW must then be updated to refer to cell I17 (named sLMDX), as shown in the screenshot below.


Note that the MDX parameter is updated to refer to the MDX expression only when the search cell has a value; when it is empty, the Subset parameter of TM1RPTROW takes precedence.

The result is a view driven by the search expression: provide a wildcard expression (which TM1 supports) and refresh the sheet.

The view will be refreshed with data for only the matching elements, as shown below, thereby delivering the purpose.


 

TIP 5. Switching between two different views in TM1 Web

Though switching between two different views is not always a requisite, there are times when a business need may require you to deliver different views.

Consider the example of the standard IBM SalesCube model in the SData instance.

In the Model dimension of SalesCube, we have the elements S Series, L Series and T Series. Use case: the ‘S Series’ and ‘T Series’ models (in this case car models) need to be forecast/budgeted for future years based on the actuals of L Series.


To accomplish this, the web screen should show ‘L Series’ when Actual is selected, and ‘T Series’ and ‘S Series’ for the Forecast/Budget version.

For illustration purposes, consider the TM1 Websheet below.

As we know by now, TM1RPTROW has a parameter for a dimension subset.

We will create three subsets containing the L Series, T Series and S Series elements respectively.


 

It is definitely not as complicated as it may sound. When Actual is selected:


From the screenshot, it is clear that when “actvsbud” is changed from Actual to any other version, the websheet view will change (as specified in TM1RPTROW) once refreshed. Developers can use a nested IF if the requirement is to have a different view set for each selection in the dimension.

Refreshing the websheet results in the view below:


TIP 6. Locking rows to restrict user entry

Think SECURITY: there will be scenarios where users are given view-only access to TM1 Web applications.

While security is good and essential, there would be times when the underlying cube has huge volumes of data and applying cell security may result in performance issues.

While formatting (locking the cells/rows of a particular measure) in Excel is an option, formatting an area in a TM1 active form can be used extensively to avoid cell security.

Consider an example (refer to the figure below) where the 1.6 Series rows need to be non-editable.

Use the Excel ISNUMBER and SEARCH functions to look for '1.6' in the TM1RPTROW elements (as shown in the screenshot below). Name the row L to set up formatting in the format area of the active form.


Insert a row in the format area and name the row as L as shown below.


Format the colour as required and lock the cells as shown below.


Hide the rows and columns within the active form that are not meant for users, and protect the sheet from the Review tab as shown below.


I hope you enjoyed reading this blog as much as I enjoyed testing these cool features; until next time, keep planning on IBM Planning Analytics!

You may also like reading:

What is IBM Planning Analytics Local

IBM TM1 10.2 vs IBM Planning Analytics

All you need to know about Planning Analytics 2.0.5

Little known TM1 Feature - Ad hoc Consolidations

IBM PA Workspace Installation & Benefits for Windows 2016

101 Guide to Blockchain

TI Optimisation – An Epilogue

To Subscribe, visit http://blog.octanesolutions.com.au

 

For more Information: To check on your existing Planning Analytics (TM1) entitlements and understand “how to”, reach out to us at info@octanesolutions.com.au

Octane Software Solutions is an IBM Registered Business Partner specializing in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training.

Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us visit, OctaneSoftwareSolutions.

 

 

Line

Unraveling TM1 : Lesser Known Facets – Part A


No matter how much we think we know about TM1, there are always situations that make you think “what if there was a better way?” For example, not having to write a bunch of code just to delete subsets after using them in a source view, or not writing the same area definition twice. What if this could be done with less coding, or none at all?

This blog and its subsequent part will showcase a few of the many lesser-known features in TM1. These little tricks and tips are a step towards better code management and, of course, peace of mind.

 

TIP 1. One area definition for two different rules at N and C level

Let’s take an example of Headcount in an Employee (HR Data) cube.

  • Headcount is calculated monthly based on on-roll employee count in a particular month.
  • However, if we pan-out at an All Months level (as illustrated in the figure below), total headcount adds up from Jan to Dec, which is not a true representation of employee count.


  • Correcting this needs a rule to be written at a consolidated level which will pick the headcount from Dec.
  • Instead of writing a separate rule line with the same area definition, the N-level and C-level rules can be combined in a single statement, separated by a semicolon (as illustrated in the figure below).


 

TIP 2. Error file directory

Typically, you would hardcode the directory path or fetch it from a cube where the path is stored, to keep it dynamic and to avoid issues when code moves from one server to another.

But if you are not using this cube for any other purpose, it soon becomes redundant. There is a workaround: the GetProcessErrorDirectory function returns the path, allowing you to write logs into the directory of the respective server instance.

A folder can be created inside the logging directory (as illustrated in the example below) and the file directed to that path, so even when the code is moved to a different server or instance it still works seamlessly.

Code snippet shows the use of function:


Exporting the exception to the file:


Output file:


TIP 3. Temporary Subset and Views

In TurboIntegrator (TI), when processing a cube view it is good practice to delete the views and subsets created in the Prolog, to reduce redundancy. This is commonly done in the Epilog tab of the process using various functions.

But there is a much simpler way to delete views and subsets without needing to write any code in the Epilog.

The screenshot below illustrates a code snippet; the highlighted portion is the secret recipe. Passing 1 as the additional parameter to ViewCreate, SubsetCreate or SubsetCreateByMDX marks the views and subsets as temporary, and they are deleted automatically once the process completes.

This also improves overall performance: temporary objects cannot create locks, so the TI process does not need to wait for locks to be released before working with them.
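
A minimal Prolog sketch, using assumed cube, dimension and element names ('Sales', 'Version', 'Actual'):

TI code (illustrative):

    # Passing 1 as the temporary flag means no Epilog clean-up is needed
    sDim  = 'Version';
    sSub  = 'zTemp Actual';
    sCube = 'Sales';
    sView = 'zTemp Load View';
    SubsetCreate( sDim, sSub, 1 );
    SubsetElementInsert( sDim, sSub, 'Actual', 1 );
    ViewCreate( sCube, sView, 1 );
    ViewSubsetAssign( sCube, sView, sDim, sSub );
    ViewExtractSkipZeroesSet( sCube, sView, 1 );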


Note: These functions are available from 10.2.2 FP4 onwards.

 

I hope you enjoyed reading this blog as much as I enjoyed testing these cool features; stay tuned for Part B of this blog series on Lesser Known Facets of TM1. To subscribe, visit http://blog.octanesolutions.com.au

You may also like reading: 

What is IBM Planning Analytics Local

IBM TM1 10.2 vs IBM Planning Analytics

All you need to know about Planning Analytics 2.0.5

Little known TM1 Feature - Ad hoc Consolidations

IBM PA Workspace Installation & Benefits for Windows 2016

101 Guide to Blockchain

TI Optimisation – An Epilogue

 

For more Information: To check on your existing Planning Analytics (TM1) entitlements and understand “how to”, reach out to us at info@octanesolutions.com.au

Octane Software Solutions is an IBM Registered Business Partner specializing in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training.

Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us visit, OctaneSoftwareSolutions.

Line

Caveat Around Concurrent Data Loads - Part B


Welcome back! For those who read Part A of this blog, we hope you have tested Parallel Interaction and benefitted from it. For those who haven't, you can read it at http://blog.octanesolutions.com.au/caveat-around-concurrent-data-loads-part-a

This blog builds upon the previous one and focuses on improving performance with Parallel Interaction; it highlights TM1 facets worth considering when an object is locked.

 

Synopsis

Tips for improving support for concurrent reads/writes and parallel execution of TurboIntegrator processes, enabling higher efficiency and productivity.

 

Analysis

  • Declare unique view and subset names

This allows a TurboIntegrator process to run without locking when it is executed by concurrent users simultaneously, for example by suffixing object names as shown in the sketch below.
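
For example, the object names built in the Prolog can be suffixed with the executing user and a timestamp so that concurrent runs never collide (a sketch; the 'zLoad' prefix is arbitrary):

TI code (illustrative):

    # Build collision-proof view and subset names in the Prolog
    sSuffix = '_' | TM1User() | '_' | TimSt( Now, '\Y\m\d\h\i\s' );
    sView   = 'zLoad View'   | sSuffix;
    sSubset = 'zLoad Subset' | sSuffix;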

 

  • Establish Cube Dependency

This removes the chance of a query or process triggering cube-dependency detection during periods of user activity, which may otherwise block objects and cause contention issues for concurrent reads and writes.

To establish a cube dependency, include the AddCubeDependency function; it is valid only in a TurboIntegrator process. This function creates a manual dependency between two cubes in the model.

Syntax: AddCubeDependency('CubeA','CubeB');

Arguments:

  • CubeA - the name of the base cube.
  • CubeB - the name of the dependent cube; Cube B relies on a rule that is dependent on Cube A.
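
For example, if a 'Profit and Loss' cube pulls rates from an 'FX Rates' cube via rules (illustrative names), the dependency can be declared up front in the Prolog of the load process rather than being established by the first query under load:

TI code (illustrative):

    # Declare the rule dependency before data is written
    AddCubeDependency( 'FX Rates', 'Profit and Loss' );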

 

  • Use a ViewConstruct function in Turbo Integrator processes

This will increase the speed of a TurboIntegrator process, as it stores a Stargate view in memory on the server. The purpose of the ViewConstruct function is to pre-calculate and cache large views for quick retrieval after a data load or update.

Syntax: ViewConstruct(CubeName, ViewName);

Arguments:

  • CubeName - the cube from which you want to construct the view.
  • ViewName - the view you want to construct; this must be an existing public view on the server.
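
For example, in the Epilog of a load process a large public reporting view can be pre-cached so the first user query does not pay the calculation cost (illustrative cube and view names):

TI code (illustrative):

    # Warm the Stargate cache for the main reporting view after the load
    ViewConstruct( 'Profit and Loss', 'Month End Reporting' );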

 

  • Dimension Maintenance (or Dimension read/write)

Dimension updates place locks on read and write operations in any cube that uses the dimension. Thrashing may result from locks taken during dimension maintenance (reads/writes to dimensions) as part of executing a TurboIntegrator process.

Note: Do not include dimension maintenance as part of data upload or updates.

 

  • Maintaining an Attribute (Alias)

This again places locks on reads and writes in any cube that uses the dimension.

Thrashing may result from locks taken during attribute (alias) maintenance as part of executing a TurboIntegrator process.

Note: Do not include attribute alias maintenance as part of data upload or updates.

 

For more Information: To check on your existing Planning Analytics (TM1) entitlements and understand “how to”, reach out to us at info@octanesolutions.com.au.

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specializing in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training.

Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

Get a free one-hour consultation on us

To know more about us visit, OctaneSoftwareSolutions.

Line

Sandbox Analysis. Delivered


In this blog, we would like to introduce you to a new feature called "Compare Sandboxes". This feature, an extension of Planning Analytics Workspace's existing capability, is available in both the Cloud and Local versions.

 

What is a Sandbox? (Jump to “Feature Explained” section if you are an existing TM1 user & know this well)

A Sandbox lets you create your own personal workspace, your own version where you can enter & store data-value-changes without impacting the base (actual) data. 

  • A sandbox is not a copy of the base data, but a separate overlay or layer of your own data values entered on top of the base data.
  • A sandbox is your own personal work area; it is private to each user and cannot be seen by others.
  • Once a sandbox is created, the user can run multiple iterations on the data set without affecting the base data.
  • When the user commits the sandboxed data back into the base data, the changed values become visible to others.
  • Sandboxes help users explore different business scenarios; for example, a user may create best-case, average-case and worst-case versions.

 

So, What’s new??

While “sandboxing” has been around for a while now, users have been asking (almost demanding) for a “Compare Sandbox” feature. This, in essence, is the ability to compare different scenarios (such as best vs average vs worst) to help users analyse and drive better business decisions. It would also help streamline scenarios well before they are committed in the system.

 

Feature Explained:

As we now know, earlier versions of Planning Analytics (TM1) did not have an option whereby users could compare different sandboxes. Now, users with entitlements to Planning Analytics Workspace (PAW) licences can not only create personal scenarios in sandboxes but also view them side by side to compare and analyse cause and effect.

This was made possible because PAW treats sandbox names as elements of a dimension called Sandboxes (refer to Figure 1).

Figure 1

 

For illustration, you can now display your Best & Worst case sandbox scenarios next to each other in nested columns, and then, calculate the variance, as shown below. 

In this example, we have two sandboxes Sri_BCase and Sri_WCase.

In Figure 2, the sandboxes are the same as the base data. BCase-WCase is an on-the-fly calculation which gives the variance between Sri_BCase and Sri_WCase.

Figure 2 


 

In Figure 3, in sandbox Sri_BCase the budgeted Units for S Series 2WD for World have been increased by 3%, while Sri_WCase still holds the base data. We can now see the variance between the sandboxes in the figure below.

Figure 3


In Figure 4, in sandbox Sri_WCase the budgeted Units for S Series 2WD for World have been increased by 0.5%, while Sri_BCase holds the 5% increase made earlier. We can now see the variance between the sandboxes.

Figure 4


Figure 5 shows that this data can also be visualised using the different charts available within PAW. In this case, a stacked bar chart has been used to visualise the data.

Figure 5


 

In Figure 6, we can also visualise the impact of these changes on other values like Sales, Price and other related measures.

Figure 6


What I would like to see next

  • Capability to spread data across multiple sandboxes.
  • Capability to add members (new versions) to the Sandboxes dimension unlike the traditional way.

 

Frequent Feature refresh from IBM:

As part of its continuous improvement program, IBM has been adding features to PA Workspace based on its clients' business requirements and requests. IBM published the “Compare Sandboxes” feature in PA Workspace version 2.0.31. Some features from earlier versions include creating virtual dimensions on the fly, calculations on the fly, sorting, and ranking.

I hope you enjoyed reading this blog as much as I enjoyed testing this cool feature; stay tuned for upcoming blogs.

 

You may also like reading “What is IBM Planning Analytics Local”, “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, and “IBM PA Workspace Installation & Benefits for Windows 2016”.

 

For more Information: To check on your existing Planning Analytics (TM1) entitlements and understand how to upgrade to Planning Analytics Workspace (PAW) reach out to us at info@octanesolutions.com.au for further assistance.

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training.

Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us visit, OctaneSoftwareSolutions.

Learn how to create Sandbox via PAW

Line

IBM PA Workspace Installation & Benefits for Windows 2016


In this week's blog, our team makes it our duty to shine some light on the right tool your company can choose to meet your business needs. We will cover the much-discussed benefits of installing and using IBM's Planning Analytics Workspace on Windows 2016.

Contents
    1. What is PA and PAW
    2. PAW and Windows 2016
    3. Benefits of PAW on Windows 2016

 

 

What is PA and PAW?


Planning Analytics ("PA") is the next version, or can be considered the next generation, of TM1. IBM enhanced TM1's functionality and features and added new tools to its suite; this enhanced TM1 has been rebranded as PA. IBM's PA brings together the best of BI, Watson, planning and analytics engines under one solution.

Clients had long demanded modern data presentation and visualisation capabilities for IBM Cognos TM1, and IBM has now fulfilled that demand by embedding Workspace into Planning Analytics; this is now called Planning Analytics Workspace ("PAW"). The diagram below gives a high-level picture of the new architecture.

The following diagram shows where PAW fits into your IBM PA Local architecture:

 
 

IBM PA Workspace

As the new face of the Cognos TM1/IBM Planning Analytics solution, IBM's PAW delivers a rich, interactive user interface where you can easily build analytical and planning applications or dashboards by combining cube views, websheets, scorecards, and data visualisations. It is flexible enough to export to Excel and publish the same data back to Workspace. You can now create dynamic reports via active forms for planning and budgeting, gain insight and discovery via data visualisations from a very clean workspace and dashboard UX, and easily share content, reports and dashboards between users.

Highly Clean and Visual View-Based Interface: The interface is a highly visual, freeform design with over 25 charts, scorecards, images, shapes and many other options. You can easily synchronize data between different objects (cube views, web sheets etc.)

On Premise and On Cloud: PAW comes in two variants: an on-premise version called Local, and a cloud-based version referred to as on Cloud.

A data-rich tool that can show data from different cubes to meet business needs: PAW is also view-based, which means you can import multi-dimensional data into a workspace and it converts and displays it as the visualisation of your choice (a bar graph, scatter graph, line graph and so forth).

Easily share between hundreds of users: PAW is a highly interactive viewer that makes it easy to switch from data exploration to charts. The main advantage of a drag-and-drop workspace, unlike some other platforms that require you to write report rules yourself, is that business users can easily build and share their own reports and dashboards.

Supports Analysis, Reporting and Write-back: If, for example, you require write-back or what-if analysis of data, or need aggregated data from high-volatility applications in real time, you can benefit from PA's consistent performance and tightly controlled latency for both cached and non-cached data.

Mobile Compatibility: Workspace is mobile and can be accessed from tablets and iPads. Aside from supporting the main web browsers such as Chrome, Safari, Internet Explorer and Firefox, you can work on the go from meeting to meeting. This full-service analytics solution supports faster loading and scrolling for both web and cloud interfaces, enabling high performance across WAN and wireless networks.

Fast Querying and Loading Time: The TI debugger in PA now uses a TM1 server as part of the back end. This means that whether you are on a web browser or a cloud interface there is less latency and fewer errors when querying, building or visualising reports compared with traditional BI engines. Loading time is two (2) to four (4) times faster.

PAW comes with the above features and more, such as additional, highly versatile visualization features.

What makes the additional visualisation capabilities of IBM PAW unique is that they do not require additional add-on licences!

 

PAW Installation

On the other hand, the installation of PAW is not as smooth sailing as a TM1 or Cognos BI installation. PAW needs an additional piece of software called Docker to be set up before PAW itself can be installed. Once Docker is installed and ready to use, the next step is the PAW installation.
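Before kicking off the PAW installation, it is worth confirming that Docker is actually present and that its engine is reachable. The snippet below is a minimal, hypothetical pre-flight check written in Python, not part of IBM's installer; it assumes the docker CLI is on the PATH and simply shells out to the standard docker version command.

```python
import subprocess
import sys

def docker_ready(timeout: int = 30) -> bool:
    """Return True if the docker CLI exists and can reach a running engine."""
    try:
        # 'docker version' queries both the client and the daemon, so a zero
        # exit code means the engine is up and reachable.
        result = subprocess.run(
            ["docker", "version"], capture_output=True, text=True, timeout=timeout
        )
        return result.returncode == 0
    except (FileNotFoundError, subprocess.TimeoutExpired):
        # docker CLI missing from PATH, or the daemon did not respond in time.
        return False

if __name__ == "__main__":
    if docker_ready():
        print("Docker is installed and running - OK to start the PAW installation.")
    else:
        sys.exit("Docker is missing or not running - set Docker up before installing PAW.")
```

If the check fails, install and start Docker first, then run it again before launching the PAW installation.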


 

Installation Workflow

[Diagram: 4 steps to installing IBM Planning Analytics Workspace]

Though the installation process looks simple, it needs hands-on technical assistance to fix issues and make the installation and configuration a success.

Most, if not all, technical consultants who have tried to upgrade from TM1 to PA will have encountered the error below while installing PAW.

[Screenshot: PAW installation error]

Fixing this issue may delay your system upgrade, and it may also incur costs to fix in the non-production environment. Once non-production is fixed, an outage or planned reboot, along with BIOS changes, is a must to fix the production environment.
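Because the remedy involves a planned reboot and BIOS changes, the failure is most likely tied to hardware virtualization not being enabled in firmware (an assumption based on the remedy described above, not on the screenshot itself). As a rough, hypothetical way of confirming this before scheduling an outage, the Python sketch below parses the output of the standard Windows systeminfo command; it assumes an English-locale host, and note that the Hyper-V requirements lines are hidden once a hypervisor is already running.

```python
import subprocess

def virtualization_enabled_in_firmware() -> bool:
    """Best-effort check of the 'Virtualization Enabled In Firmware' flag."""
    output = subprocess.run(
        ["systeminfo"], capture_output=True, text=True, timeout=180
    ).stdout
    for line in output.splitlines():
        # systeminfo prints a line such as "Virtualization Enabled In Firmware: Yes"
        if "Virtualization Enabled In Firmware" in line:
            return line.strip().endswith("Yes")
    # Line absent: Hyper-V may already be active, or the locale differs.
    return False

if __name__ == "__main__":
    print("Virtualization enabled in firmware:", virtualization_enabled_in_firmware())
```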

 

Benefits of PAW on Windows 2016

PAW for Windows 2016 has been available since 2 June 2017 (version 2.0.21).

Let us consider your clients' business objectives for a second. Your company's strategy and technology benefit from embracing business intelligence trends such as the new PAW package support for the Windows 2016 OS.

Again, Docker is a must for getting PAW installed and configured on Windows 2016. The Docker used here is not the standard (native) Docker but a different edition, Docker Enterprise Edition (Docker EE) for Windows Server.
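Because PAW on Windows Server 2016 relies on Docker EE running Windows containers, a quick sanity check is to ask the engine which OS type it reports. The Python sketch below is a hypothetical helper built on the standard docker info --format command; the expectation that the engine should report "windows" is an assumption based on the Docker EE requirement described above.

```python
import subprocess

def docker_engine_os_type(timeout: int = 30) -> str:
    """Return the OS type ('windows' or 'linux') reported by the Docker engine."""
    result = subprocess.run(
        ["docker", "info", "--format", "{{.OSType}}"],
        capture_output=True, text=True, timeout=timeout,
    )
    return result.stdout.strip().lower()

if __name__ == "__main__":
    os_type = docker_engine_os_type()
    if os_type == "windows":
        print("Engine is in Windows-container mode - suitable for PAW on Windows 2016.")
    else:
        print(f"Engine reports OS type '{os_type}' - a Windows-container (Docker EE) engine is expected.")
```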

 

What’s the Gain with PAW for Windows 2016 OS?

Microsoft has come a long way with Windows Server 2016 and its compelling features. The most versatile, we would say, are:

  • Nano Server: A Nano Server boasts a 92 percent smaller installation footprint than the Windows Server graphical user interface (GUI) installation option.
  • Containers: Docker-based container support comes to Windows Server (a quick way to confirm the feature is enabled is sketched after this list).
  • Linux Secure Boot: Deploy Linux VMs under Windows Server 2016 Hyper-V without having to disable the otherwise stellar Secure Boot feature.
  • Storage Replica and ReFS: ReFS is a high-performance, high-resiliency file system intended for use with Storage Spaces Direct and Hyper-V workloads.
  • Storage Spaces Direct: Makes it more affordable for administrators to create redundant and flexible disk storage.
  • Nested Virtualization: Nested virtualization refers to the capability of a virtual machine to itself host virtual machines. It makes sense when a business wants to deploy additional Hyper-V hosts and needs to minimise hardware costs.
  • Hyper-V Hot-Add Virtual Hardware: You can now "hot add" virtual hardware while VMs are online and running.
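For the Containers feature called out in the list above, a simple way to confirm that it is actually enabled on the server is to query DISM. The Python sketch below is a hypothetical helper, not an IBM-supplied tool; it assumes it is run from an elevated prompt on Windows Server 2016, where the feature name is Containers.

```python
import subprocess

def containers_feature_enabled(timeout: int = 120) -> bool:
    """Check via DISM whether the Windows 'Containers' feature is enabled."""
    result = subprocess.run(
        ["dism", "/online", "/Get-FeatureInfo", "/FeatureName:Containers"],
        capture_output=True, text=True, timeout=timeout,
    )
    for line in result.stdout.splitlines():
        # DISM reports a line such as "State : Enabled" when the feature is on.
        if line.strip().lower().startswith("state"):
            return "enabled" in line.lower()
    return False

if __name__ == "__main__":
    print("Containers feature enabled:", containers_feature_enabled())
```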

 

Gain for PA and PAW Administrators

  • The first and foremost advantage is that there is no need to have virtualization enabled in VMware.
  • As there is no need to activate virtualization, a physical server is not required; instead, a VM can be used for PAW. This reduces not just the monetary cost but also the time and effort involved.
  • Embracing the new Windows OS and its new features means you can rest assured regarding the performance and availability of PAW, which was not the case with TM1 10.x.
  • Storage Replica is an amazing feature in Windows 2016; it helps during failovers and provides quick turnaround at critical times.
  • All of this leads to reduced expenses.

 

Contact one of Octane Software Solutions' specialists today for an upgrade made easy!

 

[Image: IBM Value Matrix]

Planning Analytics is one of the reasons IBM is positioned as a market leader in the performance management quadrant.

  • We are experts in Cognos TM1 technology, with extensive experience in upgrading TM1 from older versions to newer versions.
  • We have migrated our clients' systems from IBM TM1 9.x and 10.x versions to the new technology, Planning Analytics (TM1).
  • We have installed Planning Analytics Workspace and integrated it with Planning Analytics.
  • We have installed and configured Planning Analytics with Planning Analytics Workspace on Windows 2016, other Windows OS versions, AIX and Linux.

  

You may also like reading the blogs “What is IBM Planning Analytics Local” and “IBM TM1 10.2 vs IBM Planning Analytics”.

 

Who are Octane Software Solutions?

Octane Software Solutions is an official IBM Business Partner. We specialise in performance management solutions, including on-shore and off-shore TM1 delivery. We provide our clients with advice on best practices in Business Intelligence and on scaling up applications to optimise their return on investment. By working with Octane you do not need to compromise on delivery, support, expertise or training for end-to-end solutions that are cost effective as well as competitive.

For more details, please visit: www.octanesolutions.com.au

Srinivas is a Senior Technical Consultant at Octane Software Solutions. Learn more about Srinivas via LinkedIn. 

Got a question? Shoot!

