
Planning Analytics - Cloud or On-Premise




This blog details IBM Planning Analytics on-cloud and on-premise deployment options, highlighting the key points that should help you decide whether to adopt the cloud or stay on-premise.

 

IBM Planning Analytics:

As part of its continuous endeavour to improve the application interface and customer experience, IBM rebranded TM1 as Planning Analytics a couple of years back, bringing many new features and a completely new interface. With this release (called the PA 2.x version), IBM lets clients choose Planning Analytics either as local software or as Software as a Service (SaaS) deployed on the IBM SoftLayer cloud.


Planning Analytics on Cloud:

Under this offering, the Planning Analytics system operates in a remote hosted environment. Clients who choose Planning Analytics deployed "on-cloud" can reap the many benefits typical of any SaaS.

With this subscription, clients need not worry about software installation, versions, patches, upgrades, fixes, disaster recovery, hardware, and so on.

They can focus on building business models, enriching data from different source systems, and giving meaning to the data they have by converting it into business-critical, meaningful, actionable insights.

Benefits:

The list below is not exhaustive, but it covers the significant benefits.

  • Automatic software updates and management.
  • CAPEX-free; incorporates the benefits of leasing.
  • Competitiveness; long-term TCO savings.
  • Costs are predictable over time.
  • Disaster recovery, with IBM's unparalleled global datacentre reach.
  • No additional hardware costs.
  • Environmentally friendly; credits towards being carbon neutral.
  • Flexibility; capacity to scale up and down.
  • Increased collaboration.
  • Security, with options for premium server instances.
  • Work from anywhere, thereby driving up productivity and efficiency.

Clients must have an Internet connection to use SaaS, and Internet speed, of course, plays a major role. In today's world, an Internet connection has become a basic necessity for all organizations.


Planning Analytics Local (On-Premise):

Planning Analytics Local is essentially the traditional way of deploying software: it is installed on the company's in-house servers and computing infrastructure, either in its own data centre or hosted elsewhere.

In an on-premise environment, installation, upgrade, and configuration of the IBM® Planning Analytics Local software components are the organization's responsibility.

Benefits of On-Premise:

  • Full control.
  • Higher security.
  • Confidential business information remains within the organization's network.
  • Less vendor dependency.
  • Easier customization.
  • Tailored to business needs.
  • Does not require Internet connectivity, unless "anywhere" access is enabled.
  • The organization has more control over the implementation process.

As is evident, the on-premise option comes with some cons as well; a few are listed below.

  • Higher upfront cost.
  • Long implementation period.
  • Hardware maintenance and IT cost.
  • In-house skills management.
  • Longer application development cycles.
  • Robust but inflexible.

On-premise software demands constant maintenance and ongoing servicing from the company’s IT department.

Organizations on-premise have full control over the software and its related infrastructure, and can perform internal and external audits as and when needed or recommended by governing/regulatory bodies.

Before making the decision, it is also important to consider many other influencing factors: the necessary security level, the potential for customization, the number of users, modellers, and administrators, the size of the organization, the available budget, and the long-term benefits to the organization.

While you ponder this, many clients have adopted the "mid-way" of a hybrid environment, under which, based on factors like workload economics, application evaluation and assessment, and security and risk profiles, applications are gradually moved from on-premise to cloud in a phased manner.

 

You may also like reading "What is IBM Planning Analytics Local", "IBM TM1 10.2 vs IBM Planning Analytics", "Little known TM1 Feature - Ad hoc Consolidations", and "IBM PA Workspace Installation & Benefits for Windows 2016".

For more information: to check on your existing Planning Analytics (TM1) entitlements and to understand how to upgrade to Planning Analytics Workspace (PAW), reach out to us at info@octanesolutions.com.au for further assistance.

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training. Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us, visit Octane Software Solutions.


Integrating transaction logs to web services for PA on AWS using the REST API


In this blog post, we will walk through how to expose transaction logging in Planning Analytics (PA) V12 on AWS to users. Currently, Planning Analytics offers no user interface (UI) option to access transaction logs directly from Planning Analytics Workspace. However, there is a workaround: expose the transactions to a host server and access the logs there. By following these steps, you can successfully access transaction logs in Planning Analytics V12 on AWS using the REST API.


Step 1: Creating an API Key in Planning Analytics Workspace

The first step in this process is to create an API key in Planning Analytics Workspace. An API key is a unique identifier that provides access to the API and allows you to authenticate your requests.

  1. Navigate to the API Key Management Section: In Planning Analytics Workspace, go to the administration section where API keys are managed.
  2. Generate a New API Key: Click on the option to create a new API key. Provide a name and set the necessary permissions for the key.
  3. Save the API Key: Once the key is generated, save it securely. You will need this key for authenticating your requests in the following steps.

Step 2: Authenticating to Planning Analytics As a Service Using the API Key

Once you have the API key, the next step is to authenticate to Planning Analytics as a Service using this key. Authentication verifies your identity and allows you to interact with the Planning Analytics API.

  1. Prepare Your Authentication Request: Use a tool like Postman or any HTTP client to create an authentication request.
  2. Set the Authorization Header: Include the API key in the Authorization header of your request. The header format should be Authorization: Bearer <API Key>.
  3. Send the Authentication Request: Send a request to the Planning Analytics authentication endpoint to obtain an access token.
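
As a minimal Python sketch of Steps 1 and 2 (the host name and endpoint path are illustrative assumptions; take the exact URLs from the IBM technote linked below):

import requests

# Hypothetical values: substitute your tenant URL and the API key from Step 1.
BASE_URL = "https://<your-tenant>.planninganalytics.cloud.ibm.com"
API_KEY = "<your-api-key>"

session = requests.Session()
# Pass the API key as a bearer token in the Authorization header (Step 2).
session.headers["Authorization"] = f"Bearer {API_KEY}"

# Simple smoke test: list the databases visible to this key.
# The /api/v1/Databases path is an assumption; verify it against the technote.
response = session.get(f"{BASE_URL}/api/v1/Databases")
response.raise_for_status()
print(response.json())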

Detailed instructions for Step 1 and Step 2 can be found in the following IBM technote:

How to Connect to Planning Analytics as a Service Database using REST API with PA API Key

Step 3: Setting Up an HTTP or TCP Server to Collect Transaction Logs

In this step, you will set up a web service that can receive and inspect HTTP or TCP requests to capture transaction logs. This is crucial if you cannot directly access the AWS server or the IBM Planning Analytics logs.

  1. Choose a Web Service Framework: Select a framework like Flask or Django for Python, or any other suitable framework, to create your web service.
  2. Configure the Server: Set up the server to listen for incoming HTTP or TCP requests. Ensure it can parse and store the transaction logs.
  3. Test the Server Locally: Before deploying, test the server locally to ensure it is correctly configured and can handle incoming requests.
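
For instance, a minimal Flask receiver might look like the following (the endpoint name and output file are our own illustrative choices, not part of Planning Analytics):

import json
from flask import Flask, request

app = Flask(__name__)

@app.route("/transaction-logs", methods=["POST"])
def receive_logs():
    # Planning Analytics will POST transaction log entries to this endpoint.
    entries = request.get_json(silent=True)
    # Append each payload to a local file for later inspection.
    with open("transaction_logs.jsonl", "a") as f:
        f.write(json.dumps(entries) + "\n")
    return "", 204

if __name__ == "__main__":
    # Expose the server on all interfaces so PA can reach it.
    app.run(host="0.0.0.0", port=8080)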

For demonstration purposes, we will use a free web service provided by Webhook.site. This service allows you to create a unique URL for receiving and inspecting HTTP requests. It is particularly useful for testing webhooks, APIs, and other HTTP request-based services.

Step 4: Subscribing to the Transaction Logs

The final step involves subscribing to the transaction logs by sending a POST request to Planning Analytics Workspace. This will direct the transaction logs to the web service you set up.

Practical Use Case for Testing IBM Planning Analytics Subscription

Below are the detailed instructions related to Step 4:

  1. Copy the URL Generated from Webhook.site:
    • Visit Webhook.site and copy the generated URL (e.g., https://webhook.site/<your-unique-id>). The <your-unique-id> refers to the unique ID found in the "Get" section of the Request Details on the main page.

  2. Subscribe Using the Webhook.site URL:
    • Open Postman or any HTTP client.
    • Create a new POST request to the subscription endpoint of Planning Analytics.
    • In the body of the request, paste the URL generated from Webhook.site:

{
 "URL": "https://webhook.site/your-unique-id"
}
Here, <tm1db> is a placeholder for the name of your TM1 database.
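
Equivalently, a hedged Python sketch of this call, reusing the authenticated session from Step 2 (SUBSCRIBE_PATH is a placeholder; copy the real subscription endpoint from the use case article above):

# Placeholder path: take the exact subscription endpoint from the article above.
SUBSCRIBE_PATH = "/api/v1/Databases('<tm1db>')/<subscribe-endpoint>"

payload = {"URL": "https://webhook.site/your-unique-id"}
response = session.post(f"{BASE_URL}{SUBSCRIBE_PATH}", json=payload)
response.raise_for_status()
print("Subscribed:", response.status_code)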

Note: Only the transaction log entries created at or after the point of subscription will be sent to the subscriber. To stop the transaction logs, update the POST query by replacing /Subscribe with /Unsubscribe.

By following these steps, you can successfully enable and access transaction logs in Planning Analytics V12 on AWS using the REST API.


Tips on how to manage your Planning Analytics (TM1) effectively


Effective management of IBM Planning Analytics (TM1) can significantly enhance your organization's financial planning and performance management.


Here are some essential tips to help you optimize your Planning Analytics (TM1) processes:

1. Understand Your Business Needs

Before diving into the technicalities, ensure you have a clear understanding of your business requirements. Identify key performance indicators (KPIs) and metrics that are critical to your organization. This understanding will guide the configuration and customization of your Planning Analytics model.

2. Leverage the Power of TM1 Cubes

TM1 cubes are powerful data structures that enable complex multi-dimensional analysis. Properly designing your cubes is crucial for efficient data retrieval and reporting. Ensure your cubes are optimized for performance by avoiding unnecessary dimensions and carefully planning your cube structure to support your analysis needs.

3. Automate Data Integration

Automating data integration processes can save time and reduce errors. Use ETL (Extract, Transform, Load) tools to automate the extraction of data from various sources, its transformation into the required format, and its loading into TM1. This ensures that your data is always up-to-date and accurate.
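
As one hedged example, the open-source TM1py library can drive an existing TI load process over the TM1 REST API, so a scheduler such as cron can run it unattended; the connection details and process name below are illustrative:

from TM1py import TM1Service

# Illustrative connection details; adjust to your environment.
with TM1Service(address="tm1server", port=8010, user="admin",
                password="secret", ssl=True) as tm1:
    # Run an existing TurboIntegrator load process and report the outcome.
    success, status, error_log = tm1.processes.execute_with_return(
        process_name="load.sales.from.warehouse")
    print(status if success else f"Failed, see log: {error_log}")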

4. Implement Robust Security Measures

Data security is paramount, especially when dealing with financial and performance data. Implement robust security measures within your Planning Analytics environment. Use TM1’s security features to control access to data and ensure that only authorized users can view or modify sensitive information.

5. Regularly Review and Optimize Models

Regularly reviewing and optimizing your Planning Analytics models is essential to maintain performance and relevance. Analyze the performance of your TM1 models and identify any bottlenecks or inefficiencies. Periodically update your models to reflect changes in business processes and requirements.

6. Utilize Advanced Analytics and AI

Incorporate advanced analytics and AI capabilities to gain deeper insights from your data. Use predictive analytics to forecast future trends and identify potential risks and opportunities. TM1’s integration with other IBM tools, such as Watson, can enhance your analytics capabilities.

7. Provide Comprehensive Training

Ensure that your team is well-trained in using Planning Analytics and TM1. Comprehensive training will enable users to effectively navigate the system, create accurate reports, and perform sophisticated analyses. Consider regular training sessions to keep the team updated on new features and best practices.

8. Foster Collaboration

Encourage collaboration among different departments within your organization. Planning Analytics can serve as a central platform where various teams can share insights, discuss strategies, and make data-driven decisions. This collaborative approach can lead to more cohesive and effective planning.

9. Monitor and Maintain System Health

Regularly monitor the health of your Planning Analytics environment. Keep an eye on system performance, data accuracy, and user activity. Proactive maintenance can prevent issues before they escalate, ensuring a smooth and uninterrupted operation.
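
A small hedged sketch of such monitoring with the open-source TM1py library (connection details are illustrative, and the returned field names can vary by version):

from TM1py import TM1Service

with TM1Service(address="tm1server", port=8010, user="admin",
                password="secret", ssl=True) as tm1:
    # List active threads so long-running queries or lock waits stand out.
    for thread in tm1.monitoring.get_threads():
        print(thread.get("Name"), thread.get("State"), thread.get("Function"))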

10. Seek Expert Support

Sometimes, managing Planning Analytics and TM1 can be complex and may require expert assistance. Engaging with specialized support services can provide you with the expertise needed to address specific challenges and optimize your system’s performance.

By following these tips, you can effectively manage your Planning Analytics environment and leverage the full potential of TM1 to drive better business outcomes. Remember, continuous improvement and adaptation are key to staying ahead in the ever-evolving landscape of financial planning and analytics.

For specialized TM1 support and expert guidance, consider consulting with professional service providers like Octane Software Solutions. Their expertise can help you navigate the complexities of Planning Analytics, ensuring your system is optimized for peak performance. Book me a meeting


Saying Goodbye to Cognos TM1 10.2.x: Changes in support effective April 30, 2024


In a recent announcement, IBM unveiled changes to the Continuing Support program for Cognos TM1, impacting users of version 10.2.x. Effective April 30, 2024, Continuing Support for this version will cease to be provided. Let's delve into the details.


What is Continuing Support?

Continuing Support is a lifeline for users of older software versions, offering non-defect support for known issues even after the End of Support (EOS) date. It's akin to an extended warranty, ensuring users can navigate any hiccups they encounter post-EOS. However, for Cognos TM1 version 10.2.x, this safety net will be lifted come April 30, 2024.

What Does This Mean for Users?

Existing customers can continue using their current version of Cognos TM1, but they're encouraged to consider migrating to a newer iteration, specifically Planning Analytics, to maintain support coverage. While users won't be coerced into upgrading, it's essential to recognize the benefits of embracing newer versions, including enhanced performance, streamlined administration, bolstered security, and diverse deployment options like containerization.

How Can Octane Assist in the Transition?

Octane offers a myriad of services to facilitate the transition to Planning Analytics. From assessments and strategic planning to seamless execution, Octane's support spans the entire spectrum of the upgrade process. Additionally, for those seeking long-term guidance, Octane's expertise is available through support packages covering both the development and support facets of your TM1 application.

FAQs:

  • Will I be forced to upgrade?

    No, upgrading is not mandatory. Changes are limited to the Continuing Support program, and your entitlements to Cognos TM1 remain unaffected.

  • How much does it cost to upgrade?

    As long as you have active Software Subscription and Support (S&S), there's no additional license cost for migrating to newer versions of Cognos TM1. However, this may be a good time to consider moving to the cloud. 

  • Why should I upgrade?

    Newer versions of Planning Analytics offer many advantages, from improved performance to heightened security, ensuring you stay ahead in today's dynamic business environment. Remaining on an unsupported version brings unnecessary risk to your application.

  • How can Octane help me upgrade?

    Octane’s suite of services caters to every aspect of the upgrade journey, from planning to execution. Whether you need guidance on strategic decision-making or hands-on support during implementation, Octane is here to ensure a seamless transition. Plus we are currently offering a fixed-price option for you to move to the cloud. Find out more here 

In conclusion, while bidding farewell to Cognos TM1 10.2.x may seem daunting, it's also an opportunity to embrace the future with Planning Analytics. Octane stands ready to support users throughout this transition, ensuring continuity, efficiency, and security in their analytics endeavours.


Top 12 Planning Analytics features that you should be using in 2023


Amin Mohammad, the IBM Planning Analytics Practice Lead at Octane Solutions, takes you through his top 12 capabilities of Planning Analytics in 2023. These are his personal favourites, and there are more beyond what he covers here.

Top 12 picks of Planning Analytics

He has divided his list into PAfE and PAW, as each has its own unique capabilities, to highlight them separately.

Planning Analytics for Excel (PAfE)

1. Support for alternate hierarchies in TM1 Web and PAfE

Starting with the TM1 Set function, which has finally opened up the option to use alternate hierarchies in TM1 Web. It takes nine arguments, as opposed to the four in SUBNM, adding to its flexibility, and it supports MDX expressions as one of the arguments. This function can be used as a good replacement for SUBNM.

2. Updated look for cube viewer and set editor

Planning Analytics Workspace and Cognos Analytics have taken the extra step of providing a consistent user experience. This includes the incorporation of the Carbon Design principles, which have been implemented in the Set Editor and cube viewer in PAfE. Users get an enhanced look and feel for these components, as well as improved capabilities. This is an excellent addition that makes the most of the user experience.

3. Creating User Define Calculations (UDC)

Hands down, User Defined Calculations is by far the most impressive capability added recently. It allows you to create custom calculations using the Define Calc function in PAfE, which also works in TM1 Web. With this, you can easily perform various calculations, such as consolidating data based on a few selected elements or performing arithmetic calculations on your data. Before this capability, we had to create custom consolidation elements in the dimension itself to achieve these results in PAfE, leading to multiple consolidated elements within the dimension and making it very convoluted. The only downside is that it can be a bit technical for some users, which is a barrier to mass adoption. Additionally, the sCalcMun argument within this function is case-sensitive, so bear that in mind. Hopefully this is fixed in future releases.

4. Version Control utility

The Version Control utility validates whether the version of Pathway you are using is compatible with the version of the Planning Analytics data source. If the two versions are not compatible, you cannot use Pathway until you update the software. Version Control uses three compatibility statuses to indicate the state of compatibility:

  • normal
  • warning
  • blocked

Administrators can also configure the Version Control to download a specific version of Pathway when the update button is clicked, helping to ensure the right version of Pathway is used across your organization.

Planning Analytics Workspace (PAW)

5. Single Cell widget

Planning Analytics Workspace has recently added the Single Cell widget as a visualization, making it easier to update dimension filters. Before this, the Single Cell widget could be added by right-clicking a particular data point, but it had its limitations. 

One limitation that has been addressed is the inability to update dimension filters on the canvas once the widget has been added. Previously, to update them, one had to redo all the steps; the Single Cell visualization has changed this. Now, users can change the filters and the widget will update the data accordingly. This is a great improvement as far as user experience goes.

Additionally, the widget can be transformed into any other visualization and vice versa. When the widget is added, the data point selected at that moment is reflected in it; if nothing is selected, the top-left (first) data point in the view is used to create the widget.

 

6. Sending email notifications to Contributors

You can now easily send email notifications to contributors with the click of a button from the Contribution panel of the overview report. Clicking the button sends an email to all the members of the group assigned to the task. The email option is only active when the status is either pending approval or pending submission.

7. Add task dependencies

Now, you can add task dependencies to plans, which allows you to control the order in which tasks can be completed. For example, if there are two tasks and Task Two is dependent on Task One, Task Two cannot be opened until Task One is completed. This feature forces users to do the right thing by opening the relevant task and prevents other tasks from being opened until the prerequisite task is completed. This way, users are forced to follow the workflow and proceed in the right order.

8. Approval and Rejections in Plans with email notifications

The email notifications mentioned here are not manually triggered like the ones in pick 6. These emails are fully automated and event-based; the triggering events can be opening a plan step, submitting a step, or approving or rejecting a step. The emails sent out contain a link taking the user directly to the plan step in question, making the planning process easier for users to follow.


"The worklow capabilities of the Planning Analytics Workspace have seen immense improvements over time. It initially served as a framework to establish workflows, however, now it has become a fully matured workflow component with many added capabilities. This allows for a more robust and comprehensive environment for users, making it easier to complete tasks."

9. URL to access the PAW folder

PAW (Planning Analytics Workspace) now offers the capability to share links to a folder within the workspace. This applies to all folders, including the Personal, Favorites, and Recent tabs. This is great because it makes it easier for users to share information, and also makes the navigation process simpler. All around, this is a good addition and definitely makes life easier for the users.

10. Email books or views

The administrator can now configure the system to send emails containing books or views from Planning Analytics Workspace. Previously, the only way to share books or views was to export them into certain formats. However, by enabling the email functionality, users are now able to send books or views through email. Once configured, an 'email' tab will become available when viewing a book, allowing users to quickly and easily share their content. This option was not previously available.

11. Upload files to PA database​

Workspace now allows you to upload files to the Planning Analytics database. This can be done manually using the File Manager, found in the Workbench, or through a TI process. IBM has added a new property to the action button that uploads the file when the TI process runs. Once the file is uploaded, it can be used by the TI process to load data into TM1. This way, users do not have to save the file in a shared location; they can simply upload it from their local desktop and load the data. This is handy new functionality from IBM. Bear in mind that the process cannot run until the file has been successfully uploaded, so a large file may take time.

12. Custom themes​

Finally, improvements in custom themes. Having the ability to create your own custom themes is incredibly helpful in order to align the coloring of your reports to match your corporate design. This removes the limitation of only being able to use pre-built colors and themes, and instead allows you to customize it to your specific requirements. This gives you the direct functionality needed to make it feel like your own website when any user opens it.

That's all I have for now. I hope you found these capabilities insightful and worth exploring further.

If you want to see the full details of this blog post, click here.


Planning Analytics Audit log – Little known pitfall


This blog briefly covers a challenge we faced after enabling the audit log in one of our clients' environments. Once the audit log was turned on to capture metadata changes, the scheduled data directory backup process started to fail.

After some investigation, I found the cause was the temp file (i.e., tm1rawstore.<TimeStamp>) generated by the audit log by default and placed in the data directory.

The temp file is used by the audit log to record events before moving them to a permanent file (i.e., tm1auditstore<TimeStamp>). Sometimes you may even notice dimension-related files (i.e., DimensionName.dim.<Timestamp>); these are generated by the audit log to capture dimension-related changes.

RawStoreDirectory is the tm1s.cfg parameter related to the audit log that helped us resolve the issue. It defines the folder path for the audit log's temporary, unprocessed log files (i.e., tm1rawstore.<TimeStamp> and DimensionName.dim.<Timestamp>). If this parameter is not set, these files are placed in the data directory by default.

RawStoreDirectory = <Folderpath>

 

Now, let's also look at the other config parameters related to audit logs.

 

AuditLogMaxFileSize:

This parameter controls the maximum size an audit log file can reach before it is saved and a new file is created. The unit must be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in the tm1s.cfg, the default value is 100 MB.

AuditLogMaxFileSize=100 MB

 

AuditLogMaxQueryMemory:

This parameter controls the maximum memory the TM1 server can use to run an audit log query and retrieve the result set. The unit must be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in the tm1s.cfg, the default value is 100 MB.

AuditLogMaxQueryMemory=200 MB


AuditLogUpdateInterval:

This parameter controls how long the TM1 server waits before moving the contents of the temporary files to the final audit log file. The value is in minutes; for example, a value of 100 means 100 minutes.

AuditLogUpdateInterval=100

 

That's it, folks; I hope you learnt something new from this blog.


Automation in TM1 using AutoHotkey


This blog explains a few TM1 tasks which can be automated using AutoHotKey. For those who don't already know, AutoHotKey is an open-source scripting language used for automation.

1. Viewing TM1 process run history from the TM1 server log:

With the help of AutoHotKey, we can read each line of a text file using the Loop function; the content of each line is stored automatically in a built-in variable. We can also read the file names inside a folder using the same function, with the file names again stored in a built-in variable. By making use of this, we can extract TM1 process information from the TM1 server log and display it in a GUI; a minimal sketch of the pattern follows. Let's then go through the output of an AutoHotKey script which gives the details of process runs.
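
A minimal AutoHotKey (v1) sketch of this file-reading pattern, with an illustrative log path and filter string:

; Illustrative path to the TM1 server log.
logFile := "C:\TM1\Logs\tm1server.log"
results := ""
Loop, Read, %logFile%
{
    ; A_LoopReadLine holds the current line of the file being read.
    IfInString, A_LoopReadLine, TM1.Process
        results .= A_LoopReadLine . "`n"
}
MsgBox, %results%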

  • When the script is executed, an input form asks for the log folder and data folder paths.
  • After entering the details and clicking OK, the list of processes in the server's data folder is displayed in a GUI.
  • Once the list of processes is displayed, double-click a process to see its run history: the status, date, time, average run time, error message, and the username of whoever executed the process. This surfaces TM1 process history from the TM1 server log.

 

2. Opening TM1Top after updating tm1top.ini file and killing a process thread

With the help of the same Loop function used earlier, we can read the tm1top.ini file and update it using the FileAppend function in AutoHotKey. Let's again go through the output of an AutoHotKey script which opens TM1Top.

  • When the script is executed, a prompt asks whether to update the adminhost parameter of the tm1top.ini file.
  • Clicking "Yes" brings up a new screen where the new adminhost must be entered.
  • After entering the value, another prompt asks whether to update the servername parameter of the tm1top.ini file.
  • Clicking "Yes" brings up a new screen where the new servername must be entered.
  • After entering the value, TM1Top is displayed. A username and password are required to verify access.
  • Once access is verified, enter the thread ID that needs to be cancelled or killed.

 


Potential Data Loss: A Quick Fix is a Must for PA Cloud and PA Local


IBM has identified a defect in code introduced in TM1 10.2.2 Fix Pack 7 and present in all subsequent releases before PA 2.0.9. This defect can cause data loss within cubes even after a SaveDataAll is performed within the TM1 server. Let's get into the details.

What is the defect:

There is a possibility of losing data even after a SaveDataAll is performed. This defect (APAR PH19984) has been identified recently by IBM and is triggered only when the conditions below are met.

  1. No SaveDataAll: no SaveDataAll has been performed since the TM1 server was last restarted.
  2. Lock contention: lock contention occurs on a public subset, TI process, or chore.
  3. Rollback: the SaveDataAll thread rolls back due to the lock contention.
  4. Server restart: the TM1 server restarts following the above.

How to Find:

To find out whether your TM1 server might encounter this issue, please follow the steps below.

  1. If not already enabled, enable the debug options in tm1s-log.properties.
    TM1.Lock.Exception=DEBUG
    TM1.SaveDataAll=DEBUG
  2. Identify the SaveDataAll thread: look for “Starting SaveDataAll” in tm1server.log.
  3. Check whether a lock-contention rollback on SaveDataAll has been triggered in tm1server.log: look for “CommitActionLogRollback: Called for thread ‘xxxxx’” and check whether xxxxx is the SaveDataAll thread.
  4. If “CommitActionLogRollback: Called for thread ‘xxxxx’” is found before “Leaving SaveDataAll critical section”, there is a high chance you are prone to this defect and it might cause data loss. A scripted version of this check is sketched below.
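
As a rough helper for steps 2-4, the ordering check can be scripted in Python; this sketch ignores thread IDs, so treat any hit as a prompt for manual review of the log:

from pathlib import Path

inside_savedataall = False
for line in Path("tm1server.log").read_text(errors="ignore").splitlines():
    if "Starting SaveDataAll" in line:
        inside_savedataall = True
    elif "Leaving SaveDataAll critical section" in line:
        inside_savedataall = False
    elif inside_savedataall and "CommitActionLogRollback" in line:
        # Possible exposure to APAR PH19984; confirm the thread manually.
        print("Rollback during SaveDataAll window:", line.strip())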

 

Impacted users:

All clients using Planning Analytics on-cloud or on-premise (Local) with TM1 server version 10.2.2 Fix Pack 7 or PA versions 2.0 through 2.0.9.

 

How to avoid:

This can be avoided in two ways.

  1. Automate SaveDataAll (best practice) to run at regular intervals; otherwise, do this manually.
  2. For PA Local users, apply the fix released by IBM on 17 December 2019; click here for more details.

 

Octane Software Solutions is an IBM Gold Business Partner specialising in TM1, Planning Analytics, Planning Analytics Workspace and Cognos Analytics, and descriptive, predictive, and prescriptive analytics.

You may also like reading "What is IBM Planning Analytics Local", "IBM TM1 10.2 vs IBM Planning Analytics", "Little known TM1 Feature - Ad hoc Consolidations", "IBM PA Workspace Installation & Benefits for Windows 2016", "PA + PAW + PAX (Version Conformance)", "IBM Planning Analytics for Excel: Bug and its Fix", and "Adding customizations to Planning Analytics Workspace".

 


Planning Analytics for Excel: Trace TI status


IBM has been recommending that users move to Planning Analytics for Excel (PAX) from TM1 Perspectives and/or TM1 Web. This blog is dedicated to clients who have either recently adopted PAX or are contemplating it, and shares the steps to trace/watch TI process status while running processes in Planning Analytics for Excel.

Follow the steps below to run processes and check TI process status.

1. Once you connect to Planning Analytics for Excel, you will see the cubes on the right-hand side; if you don't, you may need to click on the Task Pane.

 

 

2. Click on the middle icon as shown below, then click Show Processes. This shows all processes (to which the respective user has access) in the Task Pane.

 

 

3. You will now be able to see the processes.

 


 

4. To check/trace the status of a process triggered via Planning Analytics for Excel, right-click on Processes and click Active Processes.

 


 

 

5. A new box will pop up as shown below.

 

 

6. You can now run a process from the Task Pane and track its status in the box opened in step 5.

 


 

 

7. You can now see the status of the process in this box. The screen print below shows that for the process cub.price.load.data, 4 out of 5 tasks have completed.


 

8. The screen prints below show the possible statuses of a TI process: Working, Completed, and Process completed with errors.


 

Once done, you should be able to trace TI status in Planning Analytics for Excel. Happy transitioning!

As I pen my last blog for 2019, I wish you and your dear ones a prosperous and healthy 2020.

Until next time....keep planning & executing.

 


Planning Analytics Secure Gateway: Token Expiry


Before you read further, please note that this blog concerns the Secure Gateway connection used for Planning Analytics deployed on-cloud as a Software as a Service (SaaS) offering.
This blog details the steps to renew the Secure Gateway token, either before or after the token has expired.
This blog details steps on how to renew secure gateway Token, either before or after the Token has expired.

What is IBM Secure Gateway:

The IBM Secure Gateway for IBM Cloud service provides a quick, easy, and secure solution for establishing a link between Planning Analytics on cloud and a data source, typically an RDBMS source such as IBM Db2, Oracle Database, SQL Server, or Teradata. Data sources can reside either on-premise or on-cloud.

Secure and Persistent Connection:

By deploying this lightweight, natively installed Secure Gateway Client, a secure, persistent connection can be established between your environment and the cloud. This allows your Planning Analytics models to interact seamlessly and securely with on-premises data sources.

 


 

How to Create IBM Secure Gateway:

Click on Create-Secure-Gateway and follow the steps to create the connection.

Secure Gateway Token Expiry:

If the token has expired, Planning Analytics models on cloud cannot connect to source systems.

How to Renew Token:

Follow the steps below to renew the Secure Gateway token.

  • Navigate to the Secure Gateway.
  • Click on the Secure Gateway connection whose token has expired.
  • Go to Details as shown below and enter 365 (the maximum) beside Expiration days; 365 days, or one year, is the maximum period after which the token will expire again. Once done, click Update.


This should reactivate your token, and your TIs should now be able to interact with the source system.

 

You may also like reading "Predictive & Prescriptive-Analytics", "Business-intelligence vs Business-Analytics", "What is IBM Planning Analytics Local", "IBM TM1 10.2 vs IBM Planning Analytics", "Little known TM1 Feature - Ad hoc Consolidations", and "IBM PA Workspace Installation & Benefits for Windows 2016".


IBM Planning Analytics for Excel: Bug and its Fix


Since the launch of Planning Analytics a few years back, IBM has been recommending that users move to Planning Analytics for Excel (PAX) from TM1 Perspectives and TM1 Web. As new users migrate to PAX every day, it's prudent that I share my experiences.

This blog is part of a series in which I try to highlight different aspects of this migration and make users aware of them. This one specifically details a bug I encountered during a project in which our client was using PAX, and the steps taken to mitigate the issue.

 

What was the problem:

Scenario: a Planning Analytics user triggers a process from the navigation pane within PAX, uses the "Edit parameters" option to enter a value for a numeric parameter, and clicks Save to run the process.

Issue: when run this way, the process fails to complete. However, if it is run using other tools such as Architect, Perspectives, or TM1 Web, the process completes successfully.

For example, let's assume a process, cub.price.load.data, takes a numeric value as input to load data. The user clicks Edit Parameters, enters a value, and saves it to run the process. The process fails. Refer to the screenshots below.

(Screenshots: the process failing when run from PAX, and completing successfully when run from Perspectives.)

 

What’s causing this:

During our analysis, we found that when a user clicks Edit Parameters in PAX, enters a value against the numeric parameter, and saves it, the numeric parameter is converted into a string parameter in the backend, thereby modifying the TI process.

As the TI was designed and developed to handle a numeric variable and not a string, the change of the variable's type from numeric to string was causing the failure. Refer to the screenshots below.

(Screenshots: the parameter type when created, and after saving from PAX.)

What’s the fix?

The section below illustrates how we mitigated and remediated this bug.

For all TIs using a numeric parameter:

  • List all TIs using the numeric type in a parameter.
  • Convert the "Type" of these parameters to String, and rename each parameter to identify itself as a string variable (best practice). In the earlier example, I called it pValue while it held a numeric value and psValue as a string.
  • Next, within the Prolog of the TI, add code to convert the value of this parameter back into the old numeric variable, for example pValue = NUMBR(psValue);. A sketch follows below.
  • This should fix the issue.
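
As a minimal TurboIntegrator sketch of this workaround (parameter names follow the example above; the empty-string guard is an addition of ours):

# --- Prolog (sketch) ---
# psValue arrives from the PAfE prompt as a string parameter.
# Convert it back to the numeric value the rest of the process expects.
IF(TRIM(psValue) @= '');
  pValue = 0;   # default when the prompt is left empty (our assumption)
ELSE;
  pValue = NUMBR(psValue);
ENDIF;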

Note that while there are many different ways to handle this issue, this approach best suited our purpose and the project, especially considering the time and effort it would otherwise have taken to modify all affected processes.

 

Planning Analytics for Excel: versions affected

The latest available version (as of 22 October 2019) is 2.0.46, released on 13 September 2019. Before publishing this blog, we spent a good amount of time testing for this bug on all available PAX versions. It exists in all Planning Analytics for Excel versions up to 2.0.46.

Permanent fix by IBM:

This has been highlighted to IBM, and the severity of the issue explained. We believe it will be fixed in the next Planning Analytics for Excel release; as per IBM (refer to the image below), the fix appears to be part of the upcoming version 2.0.47.


 

You may also like reading "Predictive & Prescriptive-Analytics", "Business-intelligence vs Business-Analytics", "What is IBM Planning Analytics Local", "IBM TM1 10.2 vs IBM Planning Analytics", "Little known TM1 Feature - Ad hoc Consolidations", and "IBM PA Workspace Installation & Benefits for Windows 2016".


IBM Planning Analytics Secure Gateway Client: Steps to Set-Up


This blog walks through all the steps to install the IBM Secure Gateway Client.

Installing the IBM Secure Gateway Client is one of the crucial steps in setting up a Secure Gateway connection between Planning Analytics Workspace (on-cloud) and an RDBMS (relational database) on-premise or on-cloud.


What is IBM Secure Gateway:

The IBM Secure Gateway for IBM Cloud service provides a quick, easy, and secure solution for establishing a link between Planning Analytics on cloud and a data source. The data source can reside on an on-premise network or on the cloud; typical RDBMS sources include IBM Db2, Oracle Database, SQL Server, and Teradata.

Secure and Persistent Connection:

A Secure Gateway must be created so that TurboIntegrator can access on-premise RDBMS data sources; it is useful for importing data into TM1 and for drill-through capability.

By deploying the lightweight, natively installed Secure Gateway Client, a secure, persistent, and seamless connection can be established between your on-premises data environment and the cloud.

The Process:

This is a two-step process:

  1. Create the data source connection in Planning Analytics Workspace.
  2. Download and install the IBM Secure Gateway Client.

To download the IBM Secure Gateway Client:

  1. Log in to Workspace (on-cloud).
  2. Navigate to Administration -> Secure Gateway.


Click on the icon shown below; this will prompt a pop-up.

A new pop-up will come up where you are required to select the operating system on which you want to install the client.


Choose the appropriate option and click Download.

If downloads default to the Downloads folder, you will find the software there, as shown below.


Installing the IBM Secure Gateway Client:

To install the tool, right-click it and run as administrator.


 

Keep the default settings for the destination folder and language, unless you need to modify them.


Check the box below if you want to run this as a Windows service.


Now, this is an important step: we are required to enter gateway IDs and security tokens to establish a secure connection. These need to be copied from the secure connection created earlier in Planning Analytics Workspace (refer to step 1, creating the data source connection in Workspace).


The figure below illustrates Workspace displaying the Gateway ID and Security Token; these need to be copied and pasted into the Secure Gateway Client (refer to the illustration above).


If you choose to launch the client with connections to multiple gateways, take care when providing the configuration values:

  1. The gateway IDs need to be separated by spaces.
  2. The security tokens, ACL files, and log levels should be delimited by --.
  3. If you don't want to provide one of these three values for a particular gateway, use 'none'.
  4. Select whether you want the client UI; if not, select No.

Note: Please ensure that there are no residual white spaces.


Now click Install. Once the installation completes successfully, the IBM Secure Gateway Client is ready for use.

The connection is now ready: Planning Analytics can connect to data sources residing on-premise or on any other cloud infrastructure where the IBM Secure Gateway Client is installed.

 

You may also like reading "Predictive & Prescriptive-Analytics", "Business-intelligence vs Business-Analytics", "What is IBM Planning Analytics Local", "IBM TM1 10.2 vs IBM Planning Analytics", "Little known TM1 Feature - Ad hoc Consolidations", and "IBM PA Workspace Installation & Benefits for Windows 2016".


What is IBM Watson™ Studio?


IBM Watson™ Studio is a platform for businesses to prepare and analyse data, as well as build and train AI and machine learning models, in a flexible hybrid cloud environment.

IBM Watson™ Studio enables your data scientists, application developers, and subject matter experts to work together more easily and collaborate with the wider business, delivering faster insights in a governed way.




Watson Studio is also available on the desktop, bringing the most popular portions of Watson Studio Cloud to your Microsoft Windows or Apple Mac PC, with IBM SPSS® Modeler, notebooks, and IBM Data Refinery all within a single install, for comprehensive and scalable data analysis and modelling abilities.

For the enterprise, there are also Watson Studio Local, a version of the software deployed on-premises inside the firewall, and Watson Studio Cloud, which is part of the IBM Cloud™ public cloud platform. No matter which version your business uses, you can start using Watson Studio Cloud and download a trial of the desktop version today!

Over the next five days, we'll send you use cases and materials worth reviewing at your earliest convenience. Be sure to check our social media pages for these.


Moving from on-premise TM1 10.X.X to Planning Analytics on Cloud


As you plan to adopt the IBM Planning Analytics cloud, it's important to understand what the move takes. This blog highlights the areas you will be involved in when you upgrade from on-premise TM1 10.x.x to Planning Analytics on Cloud.

The good thing about the cloud is that it comes with TM1/PA and all of its components, like Planning Analytics Workspace and TM1 Web, installed and configured, meaning less effort. Also, all future release upgrades are taken care of by IBM, keeping you up to date with the latest and greatest.

So let’s quickly look at the steps as you set yourself up:

  1. Welcome Kit

Once the cloud servers are provisioned, you will receive a welcome kit with all the details of the DEV and PROD cloud environments.

This document includes items such as RDP credentials, shared folder credentials, and links for TM1 Web, Workspace, and Operations Console.

Note: IBM offers its clients the choice of a domain name for both production and development, for example http://abcdprod.planning-analytics.ibmcloud.com/ and http://abcddev.planning-analytics.ibmcloud.com/

A single blank TM1 instance, named TM1, is set up initially when the cloud server is provisioned.

  2. Secure Gateway

Create a secure gateway to establish a connection between your on-cloud Planning Analytics environment and your on-premises data sources, and then add a data source to the secure gateway. You will also need to install the Secure Gateway Client and test the connection.

  3. Support Site

Register with the IBM support site to raise and monitor tickets. This is a very important step, as all queries related to the cloud environment, including creating a new instance, require a ticket to be raised.

  4. FTP Client

Planning Analytics on Cloud includes a dedicated shared folder for storing and transferring files. You can copy files between your local computer, or a shared directory within your company network, and the Planning Analytics cloud shared folder with an FTPS application like FileZilla.

Download, install, and configure FileZilla (a free FTP solution) on users' machines so that they can copy files to and from the Planning Analytics on Cloud shared folder; a scripted alternative is sketched below.

If you have the shared path mentioned in a Sys Info cube, update it. If you have hard-coded paths in TIs, I would recommend cleaning up the TIs by pointing them to the path mentioned in the Sys Info cube.
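
If you prefer to script transfers rather than use the FileZilla GUI, here is a hedged Python sketch using the standard library's ftplib; the host, credentials, and explicit-FTPS assumption should be checked against your welcome kit:

from ftplib import FTP_TLS

# Illustrative connection details; your welcome kit states the real host,
# port, and whether the service expects explicit or implicit FTPS.
ftps = FTP_TLS("data.yourtenant.planning-analytics.ibmcloud.com")
ftps.login(user="shared_folder_user", passwd="********")
ftps.prot_p()  # protect the data channel with TLS

# Upload a local file into the cloud shared folder.
with open("prices.csv", "rb") as f:
    ftps.storbinary("STOR prices.csv", f)
ftps.quit()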

  5. Planning Analytics for Excel (PAX)

PAX is the new add-in; it replaces the Perspectives add-in used on-premises.

Download, install, and configure PAX on users' machines.

Note: schedule PAX training before asking users to test cubes, dimensions, and reports and perform data reconciliation activities, as PAX comes with new ways of doing things which require a bit of hand-holding initially.

  6. Upgrade Perspectives action buttons

Action buttons used in TM1 10.x.x need to be upgraded to be used in Planning Analytics for Excel.

Note: once an Excel report/template is upgraded, it will no longer work in Perspectives. It is essential to take backups of all Excel reports before performing this task.

 

In Summary:

  • Have a test plan to validate all the objects including security, reports and performance of TIs.
  • Take this opportunity to clean up data folder, redundant objects and cube optimisation.
  • Have a training plan in place as new features are added to PAX and PAW very frequently.
  • Keep an eye on what is new. Below are the links for PAX and PAW updates

PAX:    https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.ug_cxr.2.0.0.doc/c_nfg_PAX_test.html

PAW:  https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_nfg.2.0.0.doc/c_new_features_paw.html

 

We at Octane have vast & varied experience in migrating on-premise TM1 10.x.x to Planning Analytics on cloud.

Contact us at info@octanesolutions.com.au to find out how we can help.


IBM Planning Analytics (TM1) vs Anaplan



There has been a lot of chatter lately around IBM Planning Analytics (powered by TM1) vs Anaplan. Anaplan is a relatively new player in the market and recently listed on the NYSE. It reported revenue of USD 240.6M in 2019 (interestingly, it also reported an operating loss of USD 128.3M). Compare that to IBM, with 2018 revenue of USD 79.5 billion (there is no clear information on how much of this came from the analytics area) and a net profit of USD 8.7 billion. The global Enterprise Performance Management (EPM) market is around USD 3.9 billion and is expected to grow to USD 6.0 billion by 2022. The market for spreadsheet-based processes is a whopping USD 60 billion (source: IDC).

Anaplan was born out of the old Adaytum planning application, which was acquired by Cognos; Cognos was in turn acquired by IBM in 2007. Anaplan also spent USD 176M on sales and marketing, so most people in the industry will have heard of it or come across some form of its marketing. (Source: Anaplan.com)

I’ve decided to have a closer look at some of the crucial features and functionalities and assess how it really stacks up.

Scalability 

There are some issues around scaling up Anaplan cubes where large datasets are under consideration (an 8 billion cell limit; while this sounds big, most of our clients reach this scale fairly quickly with medium complexity). With IBM Planning Analytics (TM1), there is no need to break up a cube into smaller cubes to meet data limits, and no demand to combine dimensions into a single dimension. Cubes are generally developed with business requirements in mind, not system limitations, offering superior degrees of freedom to the business analyst.

For example, if enterprise-wide reporting were the requirement, the cubes might need to be broken up via a logical dimension like region or division. This in turn would make consolidated reporting laborious, and data slicing and dicing difficult, almost impossible.

 


Excel Interface & Integration

Love it or hate it, Excel is the tool of choice for most analysts and finance professionals. I reckon it is unwise to offer a BI tool in today's world without proper Excel integration. I find Planning Analytics (TM1) users love the ability to use the Excel interface to slice and dice, drill up and down hierarchies, and drill to the data source. The ability to create interactive Excel reports with cell-by-cell control of data and formatting is a sure-shot deal clincher.

On the other hand, on exploration I realized that Anaplan offers very limited Excel support.


 

Analysis & Reporting

In today's world, users have come to expect drag-and-drop analysis: the ability to drill down and to build and analyze alternate views of the hierarchy in real time. However, if each such query requires data to be moved around cubes and/or requires building separate cubes, it's counterproductive. It also increases maintenance and data storage overheads, and you lose sight of a single source of truth as you develop multiple cubes holding the same data just stored in different forms. This is the case with Anaplan due to the software's intrinsic limitations.

Anaplan also requires users to invest in a separate reporting layer, as it lacks native reporting, dashboards, and data visualizations.

This in turn results in:

  1. Increased cost
  2. Increased risk
  3. Increased complexity
  4. Limited planning due to data limitations

IBM Planning Analytics, on the contrary, offers out-of-the-box ability to view and analyze all your product attributes, and to slice and dice via any of them.

It also comes with a rich reporting, dashboard, and data visualization layer called Workspace. Planning Analytics Workspace delivers self-service web authoring to all users. Through the Workspace interface, authors have access to many visual options designed to help improve financial input templates and reports. Planning Analytics Workspace benefits include:

  1. Free-form canvas dashboard design
  2. Data entry and analysis efficiency and convenience features
  3. Capability to combine cube views, web sheets, text, images, videos, and charts
  4. Synchronised navigation for guiding consumers through an analytical story
  5. Browser and mobile operation
  6. Capability to export to PowerPoint or PDF



Is Your Data Good Enough for Business Intelligence Decisions?



There's no question that more and more enterprises are employing analytics tools to help with their strategic business intelligence decisions. But there's a problem: not all source data is of high quality.

Poor-quality data likely can’t be validated and labelled, and more importantly, organisations can’t derive any actionable, reliable insights from it.

So how can you be confident your source data is not only accurate, but able to inform your business intelligence decisions? It starts with high-quality software.

 

Finding the right software for business intelligence

There are numerous business intelligence services on the market, but many enterprises are finding value in IBM solutions. 

IBM’s TM1 couches the power of an enterprise database in the familiar environment of an Excel-style spreadsheet. This means adoption is quick and easy, while still offering you budgeting, forecasting and financial-planning tools with complete control.

Beyond TM1, IBM Planning Analytics takes business intelligence to the next level. The Software-as-a-Service solution gives you the power of a self-service model while delivering data governance and reporting you can trust. It's a robust, agile cloud solution that offers foresight through predictive analytics powered by IBM's Watson.

 


 

Data is only one part of the equation

But it takes more than just the data itself to make the right decisions. The data should help you make smarter decisions faster, while your business intelligence solution should make analysing the data easier. 

So how do you ensure top-notch data? Consider these elements of quality data:

  • Completeness: Missing data values aren’t uncommon in most organisations’ systems, but you can’t have a high-quality database when business-critical information is missing.
  • Standard format: Is there a consistent structure across the data – e.g. dates in a standard format – so the information can be shared and understood?
  • Accuracy: The data must be free of typos and decimal-point errors, be up to date, and be accurate to the expected ‘real-world’ values.
  • Timeliness: Is the data ready whenever it’s needed? Any delays can have major repercussions for decision-making.
  • Consistency: Data that’s recorded across various systems should be identical. Inconsistent datasets – for example, a customer flagged as inactive in one system but active in another – degrade the quality of information.
  • Integrity: Is all the data connected and valid? If connections are broken – for example, there’s sales data but no customer attached to it – the risk of duplicated data rises because related records cannot be linked.
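
To make this concrete, here is a minimal TurboIntegrator sketch of how a couple of these checks might be enforced at load time in a Planning Analytics (TM1) model. The process and variable names (Check.DataQuality, vCustomer, vDate) are hypothetical, and the rules shown are illustrative rather than exhaustive:

    # Data tab of a hypothetical 'Check.DataQuality' TI process.
    # vCustomer and vDate are illustrative source variables.

    # Integrity: a sales record with no customer cannot be linked
    IF ( TRIM( vCustomer ) @= '' );
      ItemReject( 'Missing customer on sales record' );
    ENDIF;

    # Standard format: expect dates like YYYY-MM-DD
    IF ( SCAN( '-', vDate ) = 0 );
      ItemReject( 'Unexpected date format: ' | vDate );
    ENDIF;

Rejected records land in the process error log rather than the cube, which keeps incomplete or unlinked data from silently degrading your reporting.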

Are you looking to harness the power of your source data to make actionable business decisions? Contact Octane to find out how we can help you leverage your data for true business intelligence.

Self Service: How Big Data Analytics is Empowering Users


Smart businesses are seeking out new ways to leverage the benefits of their big data analytics programs, and the self-service model is coming up trumps. By placing the onus directly on business users, enterprises are empowering customers with insights-driven dashboards, reports, and more. But it’s not the only bonus. 

Arguably an even greater upside for organisations is that it alleviates the talent shortage that often comes with big data. With most companies employing only a handful of data experts who can deliver analytics insights to customers, the self-service model frees those experts to concentrate on more important tasks, while allowing the masses to derive their own insights on their own terms.

What are the real benefits of self service?

If nothing else, a self-service model creates a ‘democratisation’ of big data, giving users the freedom to access the data they need when they need it most: during the decision-making process.

Moreover, there’s a low cost to entry – coupled with reduced expenses thanks to freeing up data science and IT resources – and faster time to insight. When users know what they need and can change their research strategies according to new and changing demands, they become more empowered.

But it’s not all smooth sailing – giving customers the tools they need for self service is only one part of the equation. They must also be educated on the potential pitfalls.

Avoid the common hurdles

When several users have access to specific data, there’s a risk of multiple copies being made over time, thus compromising the ‘one version of truth’ and possibly damaging any insights that could be derived.

Business users unfamiliar with big data analytics are also prone to mistakes, as they may be unaware of data-preparation complexities – not to mention their own behavioural biases. 

For all these issues, however, education is the solution – which is what Ancestry.com focused on when it began encouraging self-service analytics through its new data-visualisation platform. And with 51 quintillion cells of data, you can see why.

There’s no harm in starting small with big data analytics

Ancestry.com has over 10 billion historical records and about 10 million registered DNA participants, according to Jose Balitactac, the company’s FP&A Application Manager.

The old application they were using took hours to run its calculations. They evaluated seven different applications before deciding on IBM Planning Analytics.

They chose IBM Planning Analytics because it could accommodate the company’s super-cube of data; other solutions would have required them to “break it into smaller cubes, or reduce the number of dimensions, or join members, such as business units and cost centers.” They didn’t want to do that, because their processes worked.

They set up a test with IBM to time how long the model took to calculate, and it took just 10-20 seconds, which is what they wanted. You can read more about the Ancestry.com case study here.

If you’re keen to empower your business users through a self-service model, contact Octane today to learn how we can help you harness big data analytics.

Planning Analytics 2.0.7 release


The long-awaited release of Planning Analytics 2.0.7 is finally here!

I know a lot of you, like me, were eagerly awaiting this release, in particular wanting to get into the nitty-gritty details of all the documentation and testing.

Luckily for those who are not, I have summarised it all below. Happy reading; there’s lots for you and your team to consider.

With this release come some significant enhancements, which I’ll get to in the section below. We also part with several items marked for deprecation, each replaced in some shape or form.

As IBM advises: Updates to each version of IBM Planning Analytics are cumulative. If you are upgrading IBM Planning Analytics, review all updates since your installed version to plan your upgrade and application deployment.

This we already know... so... onto the good stuff.

Some new and exciting items to consider include:

  • Deploying a model between environments without a restart in local. Super exciting! It’s also a little involved, so more information can be found here.
  • Support for Windows Server 2019 
  • WebSphere Liberty Profile upgrade to version 18.0.0.4. This requires a manual change to the server.xml file for local installations only, adding <webContainer disableXPoweredBy="true"/> to disable sending server version info in response headers. As IBM states, the header is not required for operations and is really only informational.
  • A new OptimizeClient parameter. You can opt to load private objects on server load for all, no, admin or opsadmin users (see the tm1s.cfg sketch after this list).
  • Monitoring threads with the Top logger. In short, each thread’s status is now written to tm1top.log, and you can download the logs from IBM Planning Analytics Administration on both cloud and local. Configuration details can be found here.
  • A new TurboIntegrator function to run processes on their own thread. You can now use the RunProcess TI function to run a TurboIntegrator process in parallel on a separate thread (a short sketch follows this list)!
  • Changes to server behaviour 
    • TM1.Mdx.Interface logger reports syntax errors only when set to DEBUG level.
    • A new RulesOverwriteCellsOnLoad config parameter, which prevents rule-derived cells from being overwritten on server load (also shown in the tm1s.cfg sketch below).
  • API updates
    • Metadata updates across entities, enumerated and complex types, and actions to extend functionality with Git, Top and hiding hierarchies.
  • TM1web changes
    • Load websheets faster with a new feature flag, OptimizeCssForHiddenContent.
    • The IFERROR Excel function traps errors in a formula and can return an alternative result.
    • Improved cell formatting for data types such as currency, fractions, phone numbers, and others.
  • TM1web config defaults (a sample tm1web_config.xml excerpt follows this list)
    • ExportCellsThreshold specifies the maximum number of cells a websheet or cube view export can contain; the new default is 1000000.
    • MaximumConcurrentExports is set to 3 on cloud and 4 on local.
    • MaximumSheetsForExport default changed from 100 to 50.
    • WorkbookMaxCellCount default changed from -1 to 500000.
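
For local installations, the two new server parameters mentioned above are set in tm1s.cfg. Here is a minimal sketch, assuming the usual [TM1S] section; the values shown are illustrative, so check IBM’s documentation for the exact value mappings before using them:

    [TM1S]
    # Illustrative: controls whose private objects are loaded at
    # server startup (all / none / admin / opsadmin per this release)
    OptimizeClient=2
    # Illustrative: stops rule-derived cells being overwritten on
    # server load; verify the T/F semantics in the official docs
    RulesOverwriteCellsOnLoad=F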

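The RunProcess function deserves a quick illustration. A minimal TurboIntegrator sketch, with hypothetical process and parameter names; each call returns immediately and the child process runs on its own thread:

    # Epilog of a hypothetical 'Load.Master' process: fan out one
    # regional load per call, each running on a separate thread
    RunProcess( 'Load.Region', 'pRegion', 'APAC' );
    RunProcess( 'Load.Region', 'pRegion', 'EMEA' );
    RunProcess( 'Load.Region', 'pRegion', 'AMER' );
    # Unlike ExecuteProcess, these calls do not wait for completion

Because the calls do not block, any dependency between the regional loads has to be handled explicitly, for example by having each child process signal its completion.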

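The TM1web defaults above live in the TM1web configuration file (tm1web_config.xml). A hedged excerpt showing the new defaults; verify the key names and placement against your installed version:

    <appSettings>
      <!-- Max cells a websheet or cube view export may contain -->
      <add key="ExportCellsThreshold" value="1000000" />
      <!-- Default is 3 on cloud, 4 on local -->
      <add key="MaximumConcurrentExports" value="4" />
      <!-- Default lowered from 100 -->
      <add key="MaximumSheetsForExport" value="50" />
      <!-- Default changed from -1 -->
      <add key="WorkbookMaxCellCount" value="500000" />
    </appSettings>
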
Items marked for deprecation can be found in the Deprecation Notes.

Keep coming back soon for more on Workspace, PAX and much more, all expected here shortly. See you soon.

Cloud Migration – The God’s Algorithm


For starters, God’s algorithm is a notion originating in discussions of ways to solve the Rubik’s Cube, which can also be applied to other combinatorial puzzles and mathematical games. It refers to any algorithm that produces a solution with the fewest possible moves, the idea being that an omniscient being would know the optimal step from any given configuration (source: Wikipedia).

With the constant barrage of messaging these days all but pushing you to adopt cloud, does the choice between “To Move” or “Not To” feel almost like cracking “God’s Algorithm”? Well, hopefully by the time you are done reading this, you will have a fair understanding of what it takes and what you should consider.

Okay, now that we have set the context, let’s try to understand why migration to cloud has become such an imperative.

There are primarily two major considerations: one being Cost and the other Business.

When it comes to Cost, anything and everything related to Application, Server, Storage, Network, IT, Labor and other overheads (like space, power and cooling) comes into play. The main drivers of such expenses are hardware and software maintenance, their administration, and the requisite skill sets (read: labor).

The Business consideration, however, is more about the efficiencies that cloud adoption drives – freeing up precious time, labor, effort and funds which can then be redirected towards building an Agile Enterprise: one that responds faster to market changes and demands, can scale up or down instantly (without worrying too much about sunk costs), and thrives on thought leadership and innovation.

With all its advantages, it does however come neatly wrapped in “fine print”, which some organisations fail to read – and that is why they fail.

Let’s look at some of these:

  • All clouds are not equal: Public, Private or Hybrid – each has its own strengths and weaknesses. It’s key that the strengths resonate well with your need-gaps, and critical that the weaknesses do not impede your business plans in any way.
  • Keeping a scorecard: It’s essential to evaluate all existing workloads with respect to their economic, security and risk profiles. This helps in deciding which ones move first, which move last, and which simply stay.
  • Fine tuning: Once you have decided which workloads will move to cloud, it’s necessary to fine-tune them for cloud utilisation. One size does “NOT” fit all.
  • Cloud “means” Outsourcing: This is what most organisations get wrong! While cloud does let you take your hands off, it doesn’t mean taking your eyes off too. Lacking in-house cloud management expertise can cost dearly and result in project failures.
  • Move beyond lift & shift: “Cloud isn’t helping us much, neither is it cost effective” is something we hear a lot. Using cloud should not be only about cheap storage and hardware, but about what more you can do with it. Don’t get it wrong: cloud’s term licensing tends to be costlier in the short and medium term; however, when it comes to Total Cost of Ownership versus Total Return on Investment, cloud “always” wins hands down.

So, coming back to where we started: is there a “God’s Algorithm” out there that would make cloud migration fail-proof? While a lot of us are still searching, a few philosophies are taking good shape... here is one of them, with a sizeable following.

The key is to break your strategy into bite-size pieces. A well-planned migration, along with an airtight transition approach and a razor-sharp focus on continuous improvement, almost always ensures success. After all, by the time you have finished planning, you will know whether your application or workload is worthy of a cloud move.

Hopefully this was some good food for thought and has made your decision between “To Move” or “Not To” a bit easier.

Who are we?

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include IBM Planning Analytics (TM1) Consulting, Delivery, Support and Training.

Octane has its head office in Sydney, Australia, as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

Click here to find out more
