
Mastering TM1DISTINCT: The Smart Way to Clean Up Your MDX Subsets



IBM TM1 models can get messy fast, especially when you use alternate hierarchies and elements show up in multiple places. That's where the TM1DISTINCT MDX function quietly saves the day. Instead of blindly stripping out anything that "looks" duplicated, it understands TM1's hierarchies and only removes true duplicates – the exact same member in the exact same context.

The Problem: Why Regular DISTINCT Isn't Enough

Imagine you're building a dynamic subset of Products. You pull all products under All Products, then union it with a special "Focus Products" consolidation. Now some products appear twice in the raw result. With the classic DISTINCT, TM1 might collapse these duplicates in a way that hides the structure you actually care about.

This is especially problematic when working with:

  • Alternate hierarchies that place elements in multiple logical positions

  • Union operations that naturally generate overlapping sets

  • Complex dimension structures where the same leaf element has different parents

  • Dynamic subsets that combine multiple source sets.

TM1DISTINCT is smarter: it keeps the element where it appears in different meaningful places, and only cleans up genuine duplication caused by unions or repeated logic.

Understanding TM1DISTINCT vs DISTINCT

The key difference lies in context awareness:

  • DISTINCT: Removes any duplicate entries that match another entry based on the member name. This can accidentally collapse elements that appear legitimately in different branches of the hierarchy.

  • TM1DISTINCT: Removes duplicates only when they are truly identical – same element, same hierarchy path, same context. It respects the multi-hierarchical nature of TM1. 

While the existing DISTINCT function removes duplicate elements from a set, the new TM1DISTINCT function removes duplicate members only if they are truly identical, including their parent context. This distinction is important because a single element can appear as multiple members in a hierarchy if the element has different parents.

This distinction becomes critical when your dimension design intentionally places elements in multiple locations for different analytical views.
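
To make the contrast concrete, here is a hedged sketch using the Product scenario above (the member names are illustrative, not taken from a real model):

```mdx
-- DISTINCT matches on member name alone: if the same product appears
-- under two different parents, one occurrence may be collapsed.
DISTINCT(
  { [Product].[All Products].Children + [Product].[Focus Products].Children }
)

-- TM1DISTINCT removes a member only when element, hierarchy, and parent
-- context are all identical, so both legitimate occurrences survive.
TM1DISTINCT(
  { [Product].[All Products].Children + [Product].[Focus Products].Children }
)
```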

Practical Examples

Example 1: Basic Leaf-Level Filtering

TM1DISTINCT(
  TM1FILTERBYLEVEL(
    {Descendants([Product].[All Products])},
    0
  )
)
Here, you get a clean leaf-level list of products, free of accidental duplication, but still faithful to how the hierarchy is built. The function returns all leaf-level descendants while removing any technical duplicates that might arise from the query logic.

Example 2: Combining Multiple Sets (Union Scenario)

TM1DISTINCT(
{ TM1SubsetAll([Customer]) + [Customer].[Key Accounts] }
)

You end up with each real customer only once, even though "Key Accounts" is already part of the full customer list. This is where TM1DISTINCT truly shines – it preserves your intentional hierarchy structure while cleaning up the noise.

Example 3: Alternate Hierarchy Preservation

TM1DISTINCT(
TM1FILTERBYLEVEL(
{Descendants([Cost Center].[Total Company])},
0
)
)

Leaf-level cost centers under Total Company are returned, and any technical duplicates from unions or repeated selection logic are cleaned up safely. The alternate hierarchy placements remain intact.

Real-World Impact

Consider a retail company with a Product dimension that has both:

  • A Standard Hierarchy: All Products → Category → Subcategory → SKU

  • An Alternate Hierarchy: All Products → Channel → Brand → SKU

The same SKU (say, "Blue Shirt Medium") legitimately appears under both "Subcategory" and "Brand." Using DISTINCT here might collapse one of these occurrences, breaking reporting by channel. Using TM1DISTINCT keeps both occurrences because they represent different analytical contexts. 
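
In MDX, a channel-safe leaf list over both hierarchies might look like the following sketch. The hierarchy names (Standard, ByChannel) are assumptions for illustration; substitute your own:

```mdx
-- Union leaf-level SKUs from both hierarchies, then de-duplicate safely.
TM1DISTINCT(
  {
    TM1FILTERBYLEVEL( {Descendants([Product].[Standard].[All Products])}, 0 )
    + TM1FILTERBYLEVEL( {Descendants([Product].[ByChannel].[All Products])}, 0 )
  }
)
```

TM1DISTINCT keeps the SKU once per hierarchy context, whereas a plain DISTINCT on the same union could drop one of the two analytical views.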

Best Practices

  1. Use TM1DISTINCT when building dynamic subsets that combine multiple sets or work with alternate hierarchies

  2. Avoid it only for simple, single-hierarchy subsets where standard DISTINCT would work fine

  3. Combine with TM1FILTERBYLEVEL to ensure clean, context-aware filtering

  4. Test with your actual dimension structure to verify the results match business expectations.
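
Practices 1 and 3 combine naturally into a single dynamic-subset expression. A hedged sketch, with illustrative member names:

```mdx
-- Union the full dimension with a hand-picked consolidation, keep only
-- leaves, and let TM1DISTINCT clean up the overlap in a context-aware way.
TM1DISTINCT(
  TM1FILTERBYLEVEL(
    { TM1SubsetAll([Product]) + Descendants([Product].[Focus Products]) },
    0
  )
)
```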

Conclusion

TM1DISTINCT represents a maturation of MDX handling in Planning Analytics, acknowledging that TM1's rich hierarchy support requires intelligent de-duplication. By using it in your dynamic subsets, you ensure clean data without sacrificing the intentional structure your dimensions are built upon. Your business users – and your data model – will thank you for it.

References

  1. IBM Planning Analytics Documentation. (2024). TM1DISTINCT( <set> ). IBM. https://www.ibm.com/docs/en/planning-analytics/3.1.0?topic=tsmf-tm1distinct-set

  2. IBM Planning Analytics. (2024). TM1 specific MDX functions. IBM. https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=mfs-tm1-specific-mdx-functions


Why CIOs and CFOs Are Moving TM1 On-Prem to Planning Analytics SaaS on AWS


TM1 v11 on-prem has served organizations well for years. It’s fast, flexible, and trusted. But the expectations around enterprise platforms have changed. Today, leadership teams want lower risk, predictable costs, and systems that evolve without constant reinvestment.

That’s where Planning Analytics SaaS on AWS fits in—not as a new planning engine, but as a better way to run one.

 

You Stop Running Infrastructure

With on-prem TM1, you’re not just running a planning system—you’re running servers, storage, backups, patches, and disaster recovery. Even when nothing breaks, there’s ongoing effort and risk. 

Planning Analytics SaaS changes that. The platform is fully managed by IBM on Amazon Web Services. Availability, backups, and resilience are built in. IT teams spend less time maintaining platforms and more time supporting the business.

Costs Become Predictable

On-prem costs rarely end with licenses. Hardware refreshes, DR environments, security fixes, and upgrade projects add up quietly over time. 

SaaS replaces that with a subscription model. No capital spend, fewer surprises, and much clearer long-term cost visibility—something finance teams appreciate immediately. 

Lower Risk, Stronger Security 

In an on-prem setup, availability and security depend heavily on how much time and money the organisation can invest. 

In Planning Analytics SaaS, resilience and security are standard. High availability, encryption, and regular security updates are part of the service, not optional extras. This reduces operational risk and simplifies audits.

No More Upgrade Projects

Upgrading on-prem TM1 is disruptive, which is why many systems stay untouched for years. 

With SaaS, updates just happen. New features arrive without downtime or upgrade programs. The platform stays current without forcing the business into large, risky change initiatives.

Performance Scales When It Matters

Planning systems are pushed hardest during budgets and forecasts, but on-prem hardware is fixed year-round. 

SaaS handles peaks without permanent over-investment. Performance stays consistent during critical cycles, without IT having to guess future capacity needs. 

Faster Time-to-Value 

Standing up new environments or supporting business growth takes time when infrastructure is involved. 

With SaaS, environments are available faster, projects move more quickly, and new requirements can be supported without long lead times. This improves agility across finance and operations.

Cleaner, Modern Integration

Traditional file-based integrations are fragile and slow. 

Planning Analytics SaaS supports secure, API-based integration, making it easier to connect planning with ERP systems and cloud data platforms. This aligns better with modern enterprise data strategies.

Better Governance by Design 

SaaS comes with boundaries—no server access, no unsupported scripts, no hidden workarounds. 

While that requires adjustment, it results in cleaner architectures, fewer production issues, and stronger governance. Over time, most organisations see this as a benefit, not a limitation. 

The Bottom Line 

Moving from TM1 on-prem to Planning Analytics SaaS on AWS isn’t about changing how you plan. It’s about reducing risk, simplifying operations, and making costs and performance more predictable. 

For CIOs, it means less infrastructure and lower operational exposure. 
For CFOs, it means clearer costs, better scalability, and fewer surprises. 

In short, it’s a more modern way to run a planning platform—without losing what made TM1 valuable in the first place. 


Modernising TM1: Why Cloud Migration Alone Doesn’t Solve the Problem


IBM Planning Analytics (TM1) remains one of the most powerful planning and modelling engines used by Finance teams worldwide. Yet many organisations eventually experience frustration with their TM1 environments — slow performance, painful upgrades, rising support costs, and the quiet return of Excel.

Contrary to popular belief, these challenges are rarely caused by TM1 itself.

They are symptoms of a modernisation gap.

The Hidden Drift Problem in TM1 Environments

TM1 models often start clean, efficient, and purpose-built. Over time, however, incremental changes accumulate:

  • New dimensions added without structural discipline

  • Rules layered onto legacy logic

  • TI processes expanded beyond their original design

  • Reporting logic intertwined with source data

  • Upgrade cycles deferred

What emerges is not a broken system — but a fragile one.

Performance declines. Change cycles slow. Complexity rises.

The Common (But Incomplete) Response: “Move to Cloud”

When issues surface, organisations frequently default to infrastructure decisions:

  • Move from on-premise to cloud

  • Adopt SaaS

  • Change hosting providers

While these shifts reduce infrastructure management overhead, they do not automatically modernise the model.

A poorly structured TM1 architecture behaves the same way regardless of where it is hosted.

Better infrastructure cannot compensate for design inefficiencies.

True TM1 Modernisation Requires Three Pillars

Sustainable TM1 environments align three interdependent areas.

1. Infrastructure Modernisation

Infrastructure choices should reflect:

  • Risk and compliance requirements

  • Upgrade tolerance

  • Internal IT capabilities

  • Cost predictability objectives

Cloud platforms reduce maintenance effort — but they are only the foundation.

2. Architecture Modernisation (The Critical Lever)

Architecture modernisation is where the largest gains are realised.

Modern TM1 models typically prioritise:

  • Metadata-driven logic instead of hard-coded processes

  • Modular cube design separating input, calculation, and reporting

  • Decoupled reporting layers

  • Performance-first feeder strategies

  • Cloud-aware security models

  • Structured change management practices

Without architectural evolution, cloud migration simply relocates existing constraints.

3. Support Model Modernisation

Traditional break-fix support models introduce systemic risk:

  • Issues detected late

  • Knowledge concentrated with individuals

  • Upgrades treated as disruptive events

  • Costs becoming unpredictable

Modern support approaches focus on:

  • Proactive monitoring

  • SLA-driven response models

  • Continuous optimisation

  • Upgrade lifecycle management

  • Knowledge transfer

This operating philosophy underpins Octane Blue, our proactive TM1 managed services model.

Why This Matters for Finance Leaders

Modernised TM1 environments typically deliver:

  • Faster budgeting and forecasting cycles

  • Lower data errors

  • Safer upgrade paths

  • Reduced operational friction

  • More predictable support costs

Most importantly, Finance teams regain time, stability, and confidence.

Final Perspective

Cloud migration is valuable — but it is not modernisation by itself.

Real TM1 modernisation redesigns how the model scales, performs, and evolves with the business.

📘 Download the TM1 Modernisation Roadmap (PDF) 
📊 Request a Free Upgrade & Risk Estimate 

 


Smarter Cube Views in IBM Planning Analytics Workspace 3.1.3


Introduction 

IBM Planning Analytics Workspace 3.1.3 has made a noticeable difference to how I work with cube views every day. Instead of dragging views around and hoping I do not overwrite something important, I now use the built-in view selector to switch between different saved views in seconds.

With Undo and Redo available directly in the cube view, it feels safe to try new layouts, pivots, and filters because I can always step back if the result is not what I expected. Together, these changes reduce friction in my analysis, keep my report layouts intact, and help me move much faster from “idea” to “answer".

Part 1: From Drag-and-Drop to Smart View Selection 

Before 3.1.3 

Analysts had to drag views from the left panel onto the grid, risking accidental overwrites and losing custom filters and formatting with each swap. Finding the right view in a cluttered list of auto-generated names wasted time. Alternatively, they could click the three-dot menu and then click ‘Add View’.

Now in 3.1.3 

The new View selector dropdown lets you switch views instantly while preserving layout, filters, and formatting. Search by keyword to find relevant views fast.

Key benefits: 

  • Layout stays intact when switching views 

  • Fast scenario comparisons (Actuals → Budget → Forecast) 

  • Searchable view list for large libraries 


Part 2: Undo and Redo for Cube Views 

Before 3.1.3 

Changes to cube explorations were permanent. Pivoting dimensions, reordering members, or adjusting filters felt risky, so modelers hesitated to experiment and often retreated to Architect.

Now in 3.1.3 

Native Undo/Redo buttons on the toolbar let you experiment safely. Move dimensions, adjust filters, and reorder members with confidence—revert instantly if needed.


What you can undo/redo: 

  • Move dimensions between context, rows, and columns 

  • Expand/collapse hierarchies 

  • Apply/remove filters 

  • Reorder dimensions 

Quick Wins 

  • For report builders: Use the view selector to build flexible reports; users switch views at runtime. 

  • For modelers: Experiment freely with Undo/Redo; refine layouts through rapid iteration. 

Conclusion 

  • Planning Analytics Workspace 3.1.3 simplifies cube view workflows with a smart view selector and Undo/Redo. These features accelerate analysis, reduce friction, and help teams transition confidently from legacy tools to modern Workspace.

  • Enabling the cube viewer switcher needs to be done for each view separately, which is a drawback. It would be good to have a universal configuration for enabling it for all the views.

     

References 

[1] IBM. What's new in modelling – 3.1.3. IBM Documentation, 2025. https://www.ibm.com/docs/en/planning-analytics/3.1.0?topic=2025-whats-new-in-modeling-313 

[2] IBM. What's new in Planning Analytics Workspace 3.1.x? IBM Documentation, 2025. https://www.ibm.com/docs/en/planning-analytics/3.1.0?topic=workspace-whats-new-in-planning-analytics 

 

 

 

 


What's new in Cognos Analytics 12.1.x


Dashboards: 

Distinction between Display and Use value in dashboards: 

You can now define Display and Use values in data modules. 

The Display values are the values that you can see in a dashboard UI; the Use values are primarily for filtering logic. 

Previously, defining the Display and Use values was possible only in FM packages. This feature brings the same capability to data modules and enhances consistency across dashboards and reporting. You can interact with readable values while filters apply precise underlying identifiers. For example, you can select a Customer Name value in the dashboard UI while the filter is applied based on the underlying Customer ID value. 

 

Manage filter size and filter area visibility: 

You can now resize filter columns and hide filter areas to improve the arrangement and visibility of these elements in dashboards. 

For more information on resizing filter columns in the All tabs and This tab filter areas, see Resizing filters. 

For more information on hiding and reshowing the filter areas, see Hiding and showing filter areas. 

Option for users to export visualisation data to a CSV file: 

You can now allow your users to export visualisation data to a .csv file. 

To enable this feature, open a dashboard or a report that contains a visualisation, go to Properties > Advanced, and turn on the Allow users access to data option. 

When this option is active, users can open the data tray and download the .csv file from the Visualisation data tab. Enabling this feature also adds an Export to CSV button and Export to CSV icon to the toolbar. The button is visible to the users and to the editors. If you turn off this feature, the button disappears. 

Responsive dashboard layout: 

The 12.1.1 release introduces a responsive layout feature for dashboards. 

This feature enhances the authoring experience and usability across different devices by optimising the dashboard layout for various screen sizes, including mobile devices. You can also use it for grouping the content and organising visualisations. 

To use a responsive layout, go to the Responsive tab when you create a new dashboard and select one of the available templates, as seen in the following image: 

Dashboard creation screen with the "Responsive" tab selected. Several responsive layout templates are available as options for creating a new dashboard.

The responsive dashboard layout feature comes with the following key capabilities: 

  • Layout selection: 

You can now choose between responsive and non-responsive layouts when you create a new dashboard. 

  • Adaptive widgets: 

If you change the position of a panel or resize the dashboard window, the widget automatically adapts its placement and alignment. 

  • Intuitive resizing and swapping: 

Smart alignment algorithms facilitate smooth layout transitions, while an intuitive interface makes the authoring experience smoother and more efficient. 

  • Drop zones for precise widget placement: 

Each layout cell supports five drop zones: top, right, bottom, left, and center. You can use these zones for more control over widget placement. 

  • Cell deletion: 

Dashboards now differentiate between empty and populated cells for accurate deletion. 

  • Data population: 

The feature mirrors data population from the non-responsive layouts and supports drag-and-drop and slot item selection. If you use the copy and paste or click-add-to functions, the feature uses smart placement logic to make sure that it adds the content to empty cells. It can also split the data between existing cells. 

  • Window resizing: 

You can now dynamically resize a dashboard and its layout automatically adapts to the new screen size. It includes transition to a single-column or two-column layouts on smaller screens for enhanced readability. 

  • Printing to PDF files: 

You can print the dashboard to a .pdf file in View mode and in the New Page mode. 

  • Nested dashboard widgets: 

You can use the nested dashboard widgets as standard widgets or as containers for grouping and organising the content. 

To successfully implement the responsive layout, you must make sure that the dashboard uses manifest version 12.1.1 or later and confirm widget boundaries by employing the layout grid. However, if the widgets do not render correctly, check the layout specification and verify the feature support. 

Secure dashboard consumption with execute and traverse permissions: 

Users can now consume dashboards with only execute and traverse permissions granted on the presented data; no read permission is required. 

In previous releases of IBM® Cognos® Analytics, the read permission was required to consume dashboards. This could compromise sensitive data because dashboard consumers were able to edit and copy that data. 

Important: To strengthen the protection of data that you want other users to consume, change those users' permissions from Read to Execute and Traverse before you migrate to Cognos Analytics 12.1.1. 

However, the execute and traverse permissions put some restrictions on actions that can be taken by a dashboard consumer. Therefore, the consumer cannot perform the following actions: 

  1. Drill up and down 

  2. Export 

  3. Narrative insights 

  4. Navigate 

  5. Open dashboards 

  6. Paste copied widgets into another dashboard 

  7. Pin 

  8. Save 

  9. Save as a story 

  10. See the full data set in the data tray 

  11. Share 

  12. Switch to Edit mode. 

Personalised dashboard views: 

The 12.1.1 release comes with a new feature for simplified customisation of complex dashboard designs. 

A dashboard view is a feature that references a base dashboard, which contains your individual filters and settings. It supports the following customisation features: 

  • Filters 

  • Brushing, excluding local filters on individual visualisations 

  • Bookmarks, including the ability to set the currently selected tab 

You can create dashboard views only from an open dashboard and from within the dashboard studio, and only against saved dashboards. If the open dashboard is saved, a Save as dashboard view option appears in the save menu: 

Selecting the "Save as dashboard view" option from the save menu.

This operation works as a standard Save as operation. When the operation is complete, the original dashboard is still displayed. To access the new dashboard view, you must open it manually from the content navigation panel. 

The dashboard views have a different icon from regular dashboards. It includes an eye overlay, which is similar to a report views icon: 

A dashboard view icon has an eye overlay, which differentiates it from the regular dashboard icon.

You can customise a dashboard view by changing the brushing, filter, or bookmarks, and then saving the view. However, the dashboard view is essentially in a Consume mode, and you can't switch to the authoring mode. It also means that you can't access the metadata tree of the dashboard view or add extra filter controls to the filter dock. If you want your users to apply filters in a metadata column, you must first add that column to the base dashboard, even if you don't initially select any filter values. 

Any updates that you make to a base dashboard automatically appear in the dashboard view, except for the custom options that you define in the dashboard view itself. You can see the changes the next time that you open the dashboard view. For example, if you delete a visualisation from the main dashboard, it no longer appears in the dashboard view. 

The Save as dashboard view operation also creates a non-editable bookmark in the dashboard view. This bookmark includes the state of filters and brushing that you applied in the dashboard at the time when the dashboard view was created or last saved. When you open the dashboard view and don't select any other bookmark, this bookmark is automatically selected. 

A bookmark with the state of filters and brushing that you applied in the dashboard at the time when the dashboard view was created.

The dashboard views not only consume bookmarks from the base dashboards, but they can also have their own bookmarks. You can create them in the same way as in standard dashboards. The Cognos® Analytics UI differentiates between Shared bookmarks, which are all bookmarks from the base dashboard, and My bookmarks, which are bookmarks that belong to the dashboard view. 

A difference in the UI between "Shared bookmarks" and "My bookmarks".

If you delete the base dashboard, you can't open the dashboard view, and its entry is disabled in the content navigation. All attempts to access that dashboard view by entering its URL address directly into a browser result in an error message. Also, the Source dashboard property appears as Unavailable, for example: 

The "Source dashboard" is set to "Unavailable" because the base dashboard has been deleted.

Reporting: 

Enhanced clarity of reporting templates view: 

Release 12.1.1 enhances the user experience of navigating through report templates. 

When you open the Create a report page, it shows only templates that match the Report filter value. This change hides all Active Reports templates by default and makes only the Report templates visible. 

The "Create a report" page has the "Report" filter value applied by default.

You can use the Filter icon to customise your view. To maintain a personalised experience, Cognos® Analytics saves your selection in local storage or as a cached value. 

This enhancement also comes with upgraded filter labels, which reflect the current filter value, for example: Showing All Templates, Showing Report Templates, or Showing Active Report Templates. 

Manage queries in the report cache: 

You can manage which data queries are included in the report cache to control report performance. 

For more information on the report cache, see Caching Prompt Data. 

For example, user-dependent queries (queries to data sources that cannot be accessed by all users) might degrade report performance. 

You can exclude such performance-degrading queries from cached prompt data by setting the value of the Report cache property to No in the query property pane: 

  • In the navigation menu, click Report, then Queries in the drop-down menu. 

  • In the Queries pane, select a query. 

  • In the Properties pane, in the QUERY HINTS section, click the Report cache property. 

  • Select one of the following values: 

  • Default - the query is included in the report cache. 

  • Yes - equivalent to the Default value. 

  • No - the query is excluded from the report cache. 

For multi-level queries, this value is transferred from the lowest-level to the highest-level query. 

PostgreSQL audit deployment and model: 

The 12.1.1 release comes with a new capability for enhanced auditing and reporting in environments that use PostgreSQL as the auditing database. 

You can use a dedicated Framework Manager model and a deployment package to run reports against a PostgreSQL audit database. These resources provide a structure for analysing the audit data and creating insightful reports. 

You can access the new samples in the following locations within your installation directory: 

<installation>/samples/Audit_samples/Audit_Postgres 

<installation>/samples/Audit_samples/IBM_Cognos_Audit_Postgres.zip 

To use the PostgreSQL audit samples, make sure to create a data source connection named Audit_PG. 

Master detail relationships with 11.1 visualisations: 

You can use 11.1 visualisations in master detail relationships to present details for each master query item in a consolidated, insightful way. 

For more information on master detail relationships, see Master detail relationships. 

For the 11.1 visualisations as the detail objects, you can now choose whether the same automatic value range is used in all visualisation instances in a master detail relationship. You make this choice with the Same range for all instances of the chart option. To turn this option off or on, perform the following steps: 

  • Select a visualisation for which a master details relationship is created. 

  • In the Data Set pane of this visualisation, click the data item that defines values on the value axis. 

  • In the Properties pane, under GENERAL, click the More icon to the right of the Value range property. 

  • In the Value range window: 

  • Select Computed. 

  • Turn the Same range for all instances of the chart option on or off, depending on whether you want the instances to use the global extrema (the largest value range across all instances) or the local extrema (the value range of each individual visualisation). 


Agentic AI in Finance & TM1: Why Everyone’s Suddenly Talking About It


If you’re a TM1 professional and have been near the finance or FP&A world lately, you’ve probably heard the buzzword of the season: Agentic AI. 

It sounds fancy and must have wondered why suddenly everyone is talking about it, but honestly, it’s just AI that doesn’t sit around waiting for you to poke it. It does things — proactively and automatically. 

And when you mix that with platforms like IBM Planning Analytics / TM1, things start getting interesting. 


 

So… What Exactly Is Agentic AI?

Imagine if your TM1 rules, processes, and chores had a brain.

Not just “if X then Y”, but something that can: 

  • Notice something’s off

  • Decide what to do

  • Do it

  • Tell you what it did

  • Learn from the outcome 

That’s, in a nutshell, agentic AI in the TM1 paradigm.

Think of it as giving your FP&A stack its own mini team member — minus the coffee breaks and the usual shenanigans that you have to put up with daily.

In practical terms, agentic AI can genuinely help, rather than just being buzzword decoration floating around in everyone’s LinkedIn posts and formal or informal conversations.

Below, I highlight a few basic, yet very important, things that agentic AI is really good at doing: 

1. Automated Data Babysitting (Finally!) 

Every TM1 admin knows the pain: source system changes, missing records, late files… chaos. 

Agentic AI can: 

  • Watch data pipelines for delays
  • Fix formatting issues on the fly for your TI process
  • Alert you before the morning refresh explodes

 Basically, it’s as if you just hired an assistant for your nightly chores.

2. “Hey, Something’s Wrong” Alerts (That Make Sense)

Instead of a typical TM1 process error message that looks like it was written in 1995, agentic AI can: 

  • Spot outliers, bad allocations, and weird spikes – something you would otherwise do manually
  • Compare them to historical patterns
  • Tell you, in plain English, why it’s weird

Something along the lines of: 

“Hey, sales in APAC are 4x higher than normal for Mondays. It could be a missing filter. Want me to check?” 

Yes, please. 

3. Forecasting That Doesn’t Feel Like Guesswork 

Sure, TM1 can forecast, and it can predictive forecast really well. 

But agentic AI can simulate scenarios on its own and recommend the best one. 

Examples: 

  • Auto-build 20+ what-if scenarios
  • Rank them based on risk or probability
  • Push the best one straight into a cube

It’s like giving your CFO a crystal ball… a slightly nerdy one. 

4. TM1 Admin Tasks… Done Automatically 

This is the part TM1 developers love.

Agentic AI can:

  • Fix failing processes

  • Rewrite TurboIntegrator code

  • Clean up unused objects

  • Suggest how to reduce the cube size

Admittedly, it's all subjective and easier said than done, but the possibilities do exist, and they grow with the quality of the data we can ingest and how well we can train the model. 

5. Natural Language Access to TM1 

We’ve already seen this with the AI chat assistant in PAW, where instead of navigating a million cubes and views, we can prompt Planning Analytics with something like, “Give me gross margin by product for Q3 vs last year and show me the drivers of variance.”

And it does a fine job.

No view-building. No subset drama. No filter pain. 

6. Real-time Decision Automation

Finance teams love workflows, and agentic AI is perfect for building and automating them:

  • Approve expenses based on policy

  • Kick off TM1 processes when thresholds hit

  • Trigger emails, Teams alerts, Slack actions

  • Update commentary automatically

So instead of you actively entering forecasts or budgets, the agent proactively takes those steps for you. 
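A hedged sketch of what that threshold-driven automation might look like. The metric names, limits, and action strings below are hypothetical placeholders; in practice each action would map to a concrete step, such as executing a TI process via the TM1 REST API or posting a Teams or Slack message:

```python
def evaluate_thresholds(metrics, rules):
    """Return the workflow actions triggered by the current metric values.

    rules: metric name -> list of (threshold, action) pairs; an action
    fires when the metric meets or exceeds its threshold.
    """
    actions = []
    for metric, value in metrics.items():
        for threshold, action in rules.get(metric, []):
            if value >= threshold:
                actions.append(action)
    return actions

# Hypothetical policy: large expenses need approval, large forecast
# variances kick off a re-forecast, extreme ones also raise an alert.
rules = {
    "expense_amount":    [(10_000, "route_for_cfo_approval")],
    "forecast_variance": [(0.15, "run_ti_reforecast_process"),
                          (0.30, "send_teams_alert")],
}
triggered = evaluate_thresholds(
    {"expense_amount": 12_500, "forecast_variance": 0.20}, rules)
# triggered == ["route_for_cfo_approval", "run_ti_reforecast_process"]
```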

With time, we’re only going to see more of:

  • AI agents running close cycles

  • AI agents building dashboards

  • AI agents talking to ERP, CRM, S3, APIs without humans touching integrations

  • AI agents debugging your model while you sleep 

Why TM1 Specifically Is a Perfect Fit

As we know, TM1 is: 

  • Real-time

  • Calculation-heavy

  • Highly scriptable

  • Connected to everything

  • Used for tons of repetitive work

Which is exactly the playground where agentic AI thrives.  

Plus, TM1 developers are already half-cyborg 😉 with the stuff they automate — agents just take it further. 

 So the biggest takeaway from all of this is that agentic AI isn’t coming “in the future”; it’s already here. Things are moving, and moving very fast, in this space. 

It’s already sliding into FP&A tools, APIs, planning models, and the daily grind of finance teams. If TM1 is the engine, agentic AI is the turbocharger bolted on top. 

And yes — as a disclaimer, it might even finally stop your chores from failing at 3 AM for no reason 😉 


Octane’s AI Contract Analyser & Ask Procurement Portal: Transforming Contract Review for Modern Enterprises


Procurement teams today are under pressure to move faster, reduce risk, and operate with greater transparency. Yet contract review — one of the most critical procurement responsibilities — remains slow, manual, and highly inconsistent across most organisations. 

A major enterprise client came to Octane facing exactly this challenge. Their procurement function was overwhelmed: contracts were buried in inboxes, reviews took hours, and comparing updated versions created delays and negotiation blind spots. 

Octane delivered a powerful, AI-enabled solution using IBM WatsonX Orchestrate — combining a Contract Analyser with an intelligent Ask Procurement interface. Together, these capabilities have redefined how the client manages contract intake, review, insights, and procurement intelligence at scale. 

Blog by Alan

The Business Challenge 

The client’s procurement team was experiencing significant bottlenecks: 

  1. Contract overload and inbox chaos

    Supplier agreements arrived via email and were often lost or delayed, slowing downstream purchasing decisions.  

  2. Time-consuming manual analysis

    Procurement staff could spend 1–3 hours per contract summarising content, identifying risks, and preparing commentary for stakeholders.

  3. Difficulty comparing contract versions

    Updated supplier contracts required line-by-line manual comparison, often leading to missed red flags and weaker negotiation leverage. 

  4. Limited visibility into procurement insights

    Leaders had no quick way to query procurement data, trends, supplier risks, or anomalies.

These issues created avoidable risk, slowed procurement cycles, and stretched team capacity. 

The Octane Solution: AI-Enabled Contract Analyser + Ask Procurement 

Octane deployed a streamlined, automated solution powered by IBM WatsonX Orchestrate that addresses both contract processing and procurement intelligence. 

 AI Contract Analyser 

The analyser automatically: 

  • Captures new supplier contracts the moment they appear in email 

  • Extracts and understands contract text 

  • Summarises key clauses and obligations 

  • Identifies risks, red flags, and missing components 

  • Highlights differences between contract versions 

  • Generates a negotiation playbook 

  • Delivers insights to stakeholders instantly 

This means procurement teams no longer read contracts line-by-line — the AI does the heavy lifting. 

 Ask Procurement: AI interface for procurement intelligence 

As part of the deliverable, Octane introduced Ask Procurement, a conversational AI interface that allows users to: 

  • Query procurement data 

  • Identify spend trends 

  • Detect anomalies in contracts or vendors 

  • Access historical contract insights 

  • Surface negotiation patterns 

  • Review supplier performance indicators 

Whether it’s “Show me all suppliers with auto-renewal clauses” or “Summarise risk trends for our top five vendors,” Ask Procurement provides instant answers. 

Together, these tools create a true digital procurement co-pilot. 

The Impact for the Client 

The benefits have been significant and immediate: 

  1. Review time reduced to under a minute

    What previously took hours now happens automatically — contracts are analysed, summarised, and compared in seconds. 

  2. Reduced legal and commercial risk

    The AI produces a structured risk register, helping teams spot issues earlier and make more informed decisions. 

  3. Stronger negotiation positions

The system highlights: 

  • What changed between versions 

  • Why it matters 

  • Recommended negotiation arguments 

This gives the procurement team a consistent, data-driven advantage. 

  4. Faster procurement cycle times

    Automated intake and instant insights have removed bottlenecks, improving: 

  • Supplier onboarding 
  • Purchase approvals 
  • Contract turnaround times 
  5. No more lost contracts

    The AI automatically captures, stores, and processes every attachment. 

  6. Improved organisational intelligence

With Ask Procurement, leaders now have: 

  • Instant visibility 

  • Searchable procurement knowledge 

  • On-demand insights 

  • Clear trend analysis 

This shifts procurement from reactive to proactive. 

Why This Matters 

This project demonstrates what applied enterprise AI looks like in the real world — practical, operational, and immediately beneficial. 

It shows how organisations can: 

  • Modernise procurement without replacing systems 

  • Automate high-effort tasks with intelligent workflows 

  • Strengthen compliance and governance 

  • Provide teams with insights previously locked away in documents 

  • Use AI as an everyday digital procurement analyst 

It also reinforces that AI is not only for futuristic use cases — it is delivering meaningful value today. 

What’s Next: Full Lifecycle Automation with E-Signature 

The next extension is already underway: 
AI-driven e-signature workflows, enabling: 

  • Automated signing 

  • Routing and approval 

  • Audit trails 

  • Archiving and version control 

This will close the loop across the entire procurement lifecycle: 
Intake → Review → Insights → Decision → Signature → Storage 
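One simple way to picture that loop is as a small state machine that moves a contract through the stages above while recording an audit trail. This is an illustrative sketch, not Octane's actual implementation:

```python
STAGES = ["Intake", "Review", "Insights", "Decision", "Signature", "Storage"]

class ContractWorkflow:
    """Advances a contract through the lifecycle one stage at a time,
    recording an audit-trail entry for every transition."""

    def __init__(self, contract_id):
        self.contract_id = contract_id
        self.stage_index = 0
        self.audit_trail = [f"{contract_id}: entered {STAGES[0]}"]

    @property
    def stage(self):
        return STAGES[self.stage_index]

    def advance(self):
        if self.stage_index + 1 >= len(STAGES):
            raise ValueError("Contract already archived")
        self.stage_index += 1
        self.audit_trail.append(f"{self.contract_id}: entered {self.stage}")

wf = ContractWorkflow("C-1042")  # hypothetical contract id
while wf.stage != "Signature":
    wf.advance()                 # Intake -> Review -> ... -> Signature
```

The audit trail is what makes this pattern audit-friendly: every transition is recorded as it happens rather than reconstructed after the fact.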

Conclusion 

Octane’s AI Contract Analyser and Ask Procurement portal offer a new way forward for procurement teams looking to accelerate productivity, reduce risk, and enhance decision-making. 

By combining IBM WatsonX Orchestrate, structured AI reasoning, and deep procurement expertise, Octane has delivered a real-world, production-ready solution that transforms how contracts — and procurement intelligence — are managed at scale. 

If you'd like to explore how this could work inside your organisation, the Octane team is ready to demonstrate what’s possible. 


Transform Enterprise Performance with IBM Analytics and AI Solutions


In today’s volatile business landscape, agility is no longer a competitive advantage—it’s a necessity. True agility means moving beyond fast response to actively anticipating market shifts and seamlessly aligning people, processes, and technology to act decisively. 

From Insight to Impact

Enterprises possess vast troves of data, yet the ultimate differentiator is the ability to transform that data into actionable insights and automated, intelligent decisions. At Octane Analytics, we are driving this transformation across industries by evolving disconnected reporting tools into a unified, intelligent ecosystem powered by IBM's premier analytics and AI platforms. 

The Unified Framework for Intelligent Decisions 

IBM’s comprehensive suite of solutions—including Planning Analytics, Cognos Analytics, SPSS, Decision Optimisation, Controller, and Watsonx Orchestrate—delivers a connected framework that manages business performance from strategic vision through to operational execution. This integration establishes a data-to-decision continuum where insights fluidly integrate into planning, execution, and automation cycles. 

  • IBM Planning Analytics moves organisations beyond static budgeting to dynamic, driver-based forecasting and scenario modelling. 
  • IBM Cognos Analytics empowers business users with AI-driven dashboards and visualisation tools for deep insight exploration. 
  • IBM SPSS integrates statistical precision and data science into business planning, ensuring predictions are rooted in reliable data, not intuition. 
  • IBM Decision Optimisation models complex business scenarios to identify the most efficient and optimal outcomes. 
  • IBM Controller simplifies and automates financial consolidation, closing, and regulatory reporting. 
  • IBM Watsonx Orchestrate enables non-developers to automate repetitive workflows, directly connecting insights to business action without writing code. 

The Pivot to Predictive and Prescriptive Analytics 

Many organisations remain reactive, focused on analysing "what happened." The step-change in performance occurs when analytics shift to answering the crucial questions: “what will happen?” (Predictive) and “what should we do about it?” (Prescriptive). 

The integrated IBM ecosystem facilitates this critical shift: 

  1. Prediction Informs Strategy: Predictive models built in SPSS directly inform forecasts within Planning Analytics, making financial and operational plans immediately responsive to market shifts. 

  2. Prescription Optimises Action: Decision Optimisation identifies the best sequence of actions to achieve a business goal, operating within specified constraints. 

  3. Automation Operationalises Insight: Watsonx Orchestrate then automates the prescribed follow-up actions—whether triggering workflows in HR, Finance, or Operations—significantly boosting responsiveness and reducing manual workload.

This synergy elevates the organisation from merely data-driven to decision-driven, where insights are not just observed but fully operationalised. 

AI and Automation: Transforming Finance and Operations 

Automation is no longer confined to the IT department. Today, modern CFOs, HR executives, and department leaders are leveraging agentic AI to offload repetitive, high-volume tasks and achieve new levels of efficiency. 

Consider the impact across key functions: 

  • Financial Performance Management: Imagine a Finance Manager who automatically receives consolidated reports prepared by the IBM Controller, reviewed with AI-assisted insights from Cognos Analytics, and validated against dynamic budget forecasts from Planning Analytics. 
  • Intelligent HR Operations: A People Leader uses Watsonx Orchestrate to streamline repetitive HR tasks—from scheduling interviews and summarising resumes to ensuring records are instantly updated across all ERP systems.

At Octane Analytics, we specialise in designing and deploying these agentic AI ecosystems, ensuring automation amplifies human capability and drives measurable outcomes.  

Why Choose Octane Analytics? 

As an IBM Gold Partner, Octane Analytics offers deep, specialised expertise in integrating and optimising IBM’s entire performance management stack. 

Our approach is centred not just on product deployment, but on measurable business outcomes: enhanced agility in planning, increased accuracy in forecasting, greater efficiency in reporting, and empowerment through automation. 

Whether your immediate need is strategic financial consolidation or a full-scale enterprise performance management overhaul, our team provides the expertise to define the roadmap, deliver the integrated solution, and ensure a demonstrable Return on Investment (ROI). 

The Future: A Connected, AI-Powered Enterprise 

The future of enterprise performance hinges on connected intelligence—an environment where AI and analytics continuously learn, adapt, and act across all business functions. 

Organisations that master this integrated, AI-first approach will not only achieve operational efficiency but also build unparalleled resilience and foresight in a rapidly changing global market. At Octane Analytics, we are committed to helping enterprises realise this future, one intelligent decision at a time. 

Let’s Build the Intelligent Enterprise Together 

If you are exploring how integrated AI, advanced analytics, and automation can significantly elevate your business performance, we invite you to connect with us. Our team can provide tailored, real-world use case demonstrations—from predictive planning to automated workflow execution—all powered by IBM’s market-leading technology. 

📩 Reach out to Octane Analytics today to schedule a discovery session. 


Transforming Finance with Generative AI


In a recent project with a leading media company in Australia, we set out to demonstrate how IBM Watsonx Orchestrate can revolutionise finance operations through the power of Generative AI. The Commercial Finance team, under constant pressure to deliver timely, accurate and insight-rich reports, needed a smarter way to move beyond manual data wrangling and deliver executive-ready outputs in record time. 
Blog by Alan

That’s where WatsonX Orchestrate came in. Unlike traditional BI or workflow automation tools, Watsonx Orchestrate leverages Generative AI to not only automate repetitive tasks but also to interpret, contextualise and generate meaningful outputs. The result is a system that empowers both analysts and executives to act faster, with confidence, while minimising human bottlenecks. 

Automating Financial Report Generation

Key Capabilities:

  • Automated extraction, transformation and loading (ETL) of data from a data warehouse.

  • Automated generation of third-party and monthly executive summary reports.

  • AI-driven identification of key events influencing financial outcomes.

  • Analyst verification loop to ensure accuracy and compliance. 

Business Impact:

  • Reports created in minutes rather than weeks.

  • Reduced data duplication and inconsistencies.

  • Analysts free to focus on high-value strategic analysis.

  • Executives receive timely, validated insights for faster decision-making. 

Self-Service Financial Insights

Key Capabilities:

  • A bespoke AskFinance portal enabling natural language queries.

  • Secure access aligned with role-based permissions.

  • Pre-trained CFO scenarios to simulate executive decision contexts.

  • Integrated visualization tools for interactive reporting. 

Business Impact:

  • Executives gain independence in accessing financial data.

  • Real-time insights without reliance on BI analysts.

  • Streamlined reporting across departments and report types.

  • Forecasting and scenario modeling made simple, accurate and quick. 

Use Cases in Action

Producing Monthly YTD Monetisation Reports: automating the calculations behind key metrics, generating PowerPoint slides seamlessly, and producing clean, consistent reporting outputs in a standardised format. 

Delivering Monetisation Insights: automated chart creation and AI-driven callouts, generative commentary highlighting anomalies or areas needing attention, and a natural language interface to query insights and commentary directly. 

Tangible Benefits

  • ~60% ROI: Analysts reallocated to higher-value activities, reducing attrition costs.

  • ~99% efficiency gains: Manual reporting reduced to near-zero.

  • 2 weeks → 10 minutes: End-to-end report creation compressed dramatically.

  • Improved data quality: Automated reconciliation reduces inconsistencies and errors.

  • Scalability: Built to handle larger datasets and evolving financial needs. 

Beyond Media: Industry Relevance

The use case resonates strongly across industries, such as airlines, where BI Analysts and Finance teams spend significant time manually preparing and reconciling data. In one example, reliance on IBM Planning Analytics was slowing executive decision-making as stakeholders had to wait for analysts to deliver real-time data insights. 
 
Watsonx Orchestrate bridges this gap by delivering: 

  • Automation of complex financial workflows
  • Generative insights at scale
  • Democratisation of access to financial intelligence

Curious how Agentic AI could reshape your finance operations? Let’s start a conversation tailored to your requirements. 

 


8 Forces Reshaping the Future of Finance


The 8 Forces Reshaping the Future of Finance – and How Agentic AI Helps CFOs Lead

Gartner has pinpointed 8 disruptive forces set to fundamentally transform the finance function. These changes, spanning technological advancements, organisational shifts, and regulatory upheavals, pose both risks and opportunities for CFOs. Success will belong to those who leverage Agentic AI, such as Watsonx Orchestrate, and Extended Planning & Analytics, like IBM Planning Analytics, to not merely adapt but to lead the transformation. Finance is standing at a critical juncture: Gartner emphasises that the role of finance is evolving from historical reporting to actively shaping the future of the business.

To lead in this new landscape, CFOs require more than automation. They need Agentic AI, like IBM Watsonx Orchestrate, to operate seamlessly across workflows and Extended Planning & Analysis (xP&A), such as IBM Planning Analytics, to serve as a unified, intelligent source for forecasting, scenario planning, and decision-making. 

Together, these platforms form a new operational foundation for finance, striking a balance between cost efficiency, agility, governance, and innovation.  

1. A Workforce of AI Agents 

The Challenge: By 2027, one-third of enterprise software will embed Agentic AI. Finance tasks once performed manually will be supervised and executed by autonomous agents, driving exponential efficiency. 

The Solution: 

  • Watsonx Orchestrate deploys AI agents that autonomously reconcile data, build “what-if” scenarios, or flag exceptions across ERP, CRM, and finance platforms. 

  • These agents don’t just predict outcomes; they act — re-routing approvals, generating reports, and escalating high-value tasks.

The Outcome: Finance staff move beyond low-value reconciliation and report prep, shifting their time to strategy, storytelling, and insight creation. 

2. Machine-Dominated Decision Making 

The Challenge: By 2028, 70% of finance functions will rely on AI-powered real-time decisioning. Human-led bottlenecks will give way to AI-enhanced scenario modelling and automated choices.

The Solution: 

  • Planning Analytics creates driver-based models that focus on variables that truly move the business (e.g., unit margins, demand drivers, or tariff costs). 

  • Watsonx Orchestrate translates these models into actions, running multiple scenarios in parallel and surfacing recommendations with governance and audit trails. 

The Outcome: CFOs can make confident decisions faster — automating routine trade-offs while freeing analysts to stress-test strategy. 

3. Rise of Do-It-Yourself Tech 

The Challenge: Low-code and no-code platforms will see $41B in spend by 2028, enabling finance to become digitally self-sufficient. 

The Solution:

  • Planning Analytics provides a governed sandbox for FP&A teams to run ad-hoc models, ensuring agility without fragmenting data integrity. 

  • Watsonx Orchestrate acts as the connective tissue, pulling insights into workflows and presenting results conversationally. 

The Outcome: True finance self-sufficiency — teams empowered to experiment and run scenarios, without losing enterprise-wide consistency. 

4. The End of Transactional Customisation 

The Challenge: By 2030, most finance functions will converge on identical transactional processes. Differentiation will come from insights and agility, not customisation. 

The Solution: 

  • Watsonx Orchestrate automates repetitive, non-differentiating processes (invoice matching, close cycles, reconciliations). 

  • Planning Analytics ensures finance value lies in insight and foresight, not transactions — embedding real-time planning across the enterprise. 

The Outcome: Finance becomes a growth engine, not a cost centre, investing resources in innovation and transformation rather than maintenance. 

5. The Lonely Enterprise 

The Challenge: Self-service tech adoption (20–50% penetration in 2 years) will push analysis out of finance and into the business. 

 The Solution: 

  • Planning Analytics creates a living model of assumptions, policies, and KPIs.

  • Watsonx Orchestrate enables agents to auto-generate compliance reports, simulate regulatory impacts, and escalate issues proactively. 

The Outcome: CFOs can stay ahead of regulators, ensuring confidence in disclosures and agility in response, without ballooning compliance costs.

6. Maximally Matrixed Organisations 

The Challenge: By 2030, large enterprises will become increasingly matrixed — characterised by complex reporting lines, distributed decision-making, and cross-functional dependencies. While this model allows global scale, it comes at a cost: decision-making slows down, bottlenecks multiply, and finance often becomes the bottleneck rather than the enabler. Gartner predicts a significant reduction in corporate decision speed due to this complexity. 

How CFOs Stay Agile with IBM

  • Watsonx Orchestrate cuts across silos by deploying AI agents that integrate data from disparate systems (ERP, CRM, HR, supply chain). These agents autonomously synthesise inputs, flag bottlenecks, and propose actions without waiting for endless email chains or manual escalations.

  • Planning Analytics provides a single source of truth across geographies and business units, enabling finance teams to run real-time, driver-based scenarios that reflect the complexities of a matrixed structure.

The Outcome: CFOs regain speed and agility. Instead of being trapped in the complexity of governance and approvals, decisions are powered by cross-system insights, actionable in minutes rather than weeks. Finance evolves into the “accelerator” in a maximally matrixed enterprise.

7. The Finance Talent Crash

The Challenge: The finance profession is heading toward a talent crunch. Demand for digital, analytical, and AI skills is skyrocketing, but the supply of finance professionals with this hybrid capability is scarce. Meanwhile, much of finance talent remains locked in repetitive tasks like reconciliations, reporting, and compliance — jobs that do little to attract or retain the next generation. 

How IBM & Octane Mitigate the Crash

  • Agentic AI (Watsonx Orchestrate) automates routine, manual workflows such as reconciliations, reporting prep, and document processing. By doing so, it frees scarce talent to focus on strategic work: forecasting, scenario planning, and advising the business.

  • Planning Analytics amplifies finance professionals’ value by equipping them with tools to run advanced models, predictive forecasts, and multi-scenario analysis.

  • Octane’s AI Adoption Workshops (delivered in partnership with IBM) provide hands-on reskilling for FP&A teams. These workshops ensure finance professionals transition from “spreadsheet operators” to strategic analysts who understand both the business and the AI tools that power it. 

The Outcome: CFOs can do more with less. Talent is not just retained but re-energised, focused on high-value activities that align with business growth. The talent gap becomes an opportunity: finance professionals become champions of digital transformation rather than casualties of automation.

8. The Era of Discontinuous Regulatory Change

The Challenge: Regulatory landscapes are evolving faster than ever. From ESG disclosures to cross-border tax regimes and industry-specific compliance requirements, CFOs face a constant barrage of discontinuous, unpredictable regulatory changes. Manual compliance frameworks can no longer keep pace, exposing firms to risk and spiralling costs of control. 

How Watsonx Orchestrate & Planning Analytics Support

  • Watsonx Orchestrate embeds governance and compliance into every workflow. AI agents automatically generate audit trails, monitor transactions for anomalies, and escalate risks before they become issues. Instead of building compliance after the fact, governance becomes native and continuous.

  • Planning Analytics enables finance to run regulatory impact scenarios in real time — modeling, for example, how a new ESG disclosure requirement might affect capital allocation or how new tax rules impact profitability by geography.

  • Combined, they give CFOs the ability to adapt instantly, ensuring compliance while keeping costs under control. 

The Outcome: Regulatory change becomes less of a disruption and more of a strategic advantage. CFOs can demonstrate resilience to boards and regulators, protecting reputation while ensuring agility. 

Adaptive Scenario Planning: Why This Matters Now

The real battleground for CFOs is scenario planning. Traditional methods are too slow for today’s volatility. Adaptive approaches — powered by AI — allow finance leaders to: 

  • Run rolling forecasts updated daily, not quarterly.

  • Build driver-based models that respond instantly to tariffs, FX rates, or demand shocks.

  • Generate multiple scenarios in real time and attach clear contingency playbooks.

  • Show investors not just one “answer,” but a strategic range of preparedness.

Here’s where the synergy between Planning Analytics and Watsonx Orchestrate is critical:

  • Planning Analytics ensures the data model, drivers, and assumptions are clean, integrated, and ready for real-time updates.

  • Watsonx Orchestrate enables CFOs to simply ask, “How does a 5% tariff change impact margin by region?” and instantly receive scenario outputs — plus trigger next steps (e.g., adjust budgets, reschedule supplier contracts). 
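That tariff question reduces to simple driver arithmetic once the model holds revenue, cost, and import share per region. Here is an illustrative, self-contained sketch with made-up regional figures (in reality the inputs would come straight from the Planning Analytics model):

```python
def margin_under_tariff(regions, tariff_rate):
    """Recompute margin by region when a tariff is applied to the
    imported share of each region's cost base."""
    results = {}
    for name, r in regions.items():
        tariff_cost = r["cost"] * r["import_share"] * tariff_rate
        total_cost = r["cost"] + tariff_cost
        results[name] = round((r["revenue"] - total_cost) / r["revenue"], 4)
    return results

# Hypothetical driver data per region (revenue and cost in $m).
regions = {
    "APAC": {"revenue": 500.0, "cost": 350.0, "import_share": 0.60},
    "EMEA": {"revenue": 400.0, "cost": 300.0, "import_share": 0.20},
}
baseline = margin_under_tariff(regions, 0.00)  # e.g. APAC margin 30%
tariffed = margin_under_tariff(regions, 0.05)  # APAC margin falls to 27.9%
```

Notice how the import share acts as the driver: the region more exposed to imports (APAC here) loses more margin under the same tariff rate.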

The CFO’s Leadership Imperative 

The forces reshaping finance — from matrixed complexity to talent shortages to regulatory turbulence — are daunting. But they also present a unique opportunity. CFOs who embrace Agentic AI today won’t just adapt to disruption; they’ll lead it. 

With IBM Watsonx Orchestrate (Agentic AI) and IBM Planning Analytics (xP&A), the Office of Finance can: 

  • Automate: Cut month-end close cycles by 3× while reducing manual errors.

  • Anticipate: Run real-time “what-if” scenarios with confidence, powered by driver-based models.

  • Adapt: Stay compliant amid discontinuous regulatory change with embedded audit trails and anomaly detection.

  • Amplify: Re-deploy scarce finance talent into strategic, growth-focused roles. 

The message is clear: The 8 forces will reshape finance — but with Agentic AI, CFOs can lead the disruption, not be disrupted. 

The Payoff: Efficiency Meets Innovation

When finance leaders integrate these technologies, the results are dramatic:

  • 99% faster reporting – weeks of manual effort compressed into minutes.

  • 3× faster close cycles – freeing capacity for forward-looking analysis.

  • 60% ROI in Year One – cost savings plus strategic impact.

  • Cultural transformation – finance staff moving from routine tasks to high-value thinking: experimentation, scenario testing, and strategic advising. 

Why Partner with Octane

Transformation isn’t just about technology; it’s about execution. That’s where Octane makes the difference. Watch the recording to hear how leaders from IBM, Rinnai Australia, and Octane are already using AI to unlock efficiency, cut manual reporting by 40+ hours a week, and even accelerate M&A integration. 

  • AI Adoption Workshops: Delivered in partnership with IBM, Octane’s workshops provide hands-on reskilling for FP&A teams. These ensure finance professionals transition from “spreadsheet operators” to strategic analysts who understand both the business and the AI tools that power it.
  • Fixed-Price Upgrade Offer: Octane can modernise your xP&A platform on a fixed-price basis after just a 2-hour technical workshop with your team.
  • AI in Finance Use Cases: In parallel, after a 2-hour strategic workshop with your finance leadership, Octane will deliver two AI use cases tailored to your business — so you see tangible value in weeks, not months. 

CFOs are no longer just guardians of cost; they are champions of transformation.  

With Watsonx Orchestrate and Planning Analytics, powered by Octane’s delivery expertise, you can accelerate value in 6–8 weeks: modernise your platform, reskill your teams, and embed AI use cases that pay back immediately. 

Bring your own Use Case 

Bring to life your own use case that generates business value to your organisation with the help of our team of AI experts. 

 Talk to us!


IBM Planning Analytics: Debugging and database explorer updates


IBM Planning Analytics has introduced new features that make development and administration tasks much easier. Two of the most impactful improvements are the ability to see variable values while debugging TI processes and the enhanced Database Explorer.

1. Hover to See Variable Values in TurboIntegrator Debugger

Debugging TI processes used to mean adding log statements and rerunning processes just to see variable values. With the new hover help, simply move your mouse over a variable in the debugger, and its value is displayed (e.g., sCube = 'Asset_Input').

✅ Benefit: Makes debugging much faster, eliminates extra logging, and helps you quickly confirm whether variables are behaving as expected.

Figure 1: Hovering over a variable shows its value instantly in the TI debugger


2. Database Explorer: A Smarter Way to Navigate Your Environment

Managing a Planning Analytics server or instance often involves checking how many objects exist—be it cubes, dimensions, processes, chores, or control objects. Previously, administrators and developers had to dig through folders or rely on TI scripts to gather this information. Now, with the Database Explorer, everything is accessible in one clean interface.

Key features include:

  • Quick Object Counts: Instantly see how many cubes, dimensions, processes, and chores are available.
  • Process Data Source Types: Displays what data source a process is using (e.g., Cube, ODBC, or 'No data source').
  • Organised View: Objects are grouped into categories, reducing clutter and making navigation straightforward.
  • Centralised Actions: Access logs, import/export, manage users, refresh security, or check server version from one place.

✅ Benefit: The Database Explorer improves transparency and efficiency, helping both administrators and developers work faster by providing a unified view of objects and their data sources.

Figure 2: Navigation through the Database Explorer menu

Figure 3: Object counts displayed in Database Explorer

Figure 4: Data source displayed when clicking on Processes

Small Changes, Big Impact

These updates may seem minor, but they greatly improve user productivity. From instantly checking variable values in debugging to exploring databases more efficiently, IBM Planning Analytics is now smarter and more user-friendly.

Line

Why IBM ILOG CPLEX still leads the way in 2025


In an age where AI is often synonymous with machine learning, one of the most powerful, but often overlooked, tools in the AI toolbox is optimisation. At the heart of many real-world, high-stakes decisions lies a mathematical engine built to deliver the best possible outcome. And IBM ILOG CPLEX continues to be that engine of choice. 

As we move deeper into 2025, one hot trend is hybrid AI, the combination of predictive models with prescriptive optimisation. Why predict what might happen if you can also decide what should happen? That’s where CPLEX shines.

Real-world impact: From supply chains to smart grids 

Whether it’s dynamically routing fleets, allocating resources under uncertainty, or scheduling energy consumption during peak hours, organisations are leveraging CPLEX not just as a solver, but as a strategic decision engine.

Here are a few standout use cases:

  • Retail & E-Commerce: Predicting customer demand using ML, then using CPLEX to optimise fulfilment across a decentralised warehouse network. 

  • Utilities: Combining real-time sensor data with CPLEX-based scheduling to balance load in smart grid systems. 

  • Finance: Creating portfolio allocations that meet regulatory requirements and maximise return, all while adapting to market volatility. 

Cloud-native optimisation: Scaling with IBM Cloud Pak for Data

Another major shift? Optimisation in the cloud. IBM's Cloud Pak for Data is helping companies operationalise CPLEX models in ways that were unimaginable just a few years ago. Think seamless integration with data lakes, real-time dashboards, and API-first deployment models.

Why It Matters

As business environments grow more complex, the ability to make data-driven, optimal decisions in real-time becomes a true competitive advantage. CPLEX brings mathematical certainty to uncertain times, and when combined with machine learning, it offers a full-spectrum AI approach that’s both predictive and prescriptive. 

Are you exploring optimisation as part of your AI strategy? Let’s connect; we’re happy to exchange thoughts on where prescriptive analytics is heading next.

💬Talk to us: media@octanesolutions.com.au

Line

A Developer’s guide: Avoiding File Lock conflicts in TI with AsciiOutput


What is ASCIIOutputOpen?

In IBM Planning Analytics (TM1), TurboIntegrator (TI) processes are essential for automating data operations. One of the most useful functions in TI scripting is ASCIIOutputOpen, which allows you to open a file for writing ASCII data. Whether you need to create a new file or append data to an existing one, this function provides the flexibility to control file access and modifications efficiently.

Key Features of ASCIIOutputOpen

  • Append or Overwrite: Choose whether to overwrite an existing file or add new data to the end.  

  • Shared Read Access: Enable other processes or users to read the file while it’s being written.  

  • Supports Multiple File Types: Works seamlessly with .csv and .txt files, making it ideal for various data export needs.  

Syntax Breakdown 

The basic syntax for ASCIIOutputOpen is:  


ASCIIOutputOpen(FileName, OpeningMode);

Parameters Explained 

  1. FileName  

    • The full path and filename (including extension) where data will be written.  

    • Example: "C:\Data\Report.csv"  

  2. OpeningMode  

    • A numeric code that determines how the file is accessed.  

  • Mode 0 – Overwrite without shared read access: creates or overwrites the file; no other process can read it simultaneously.

  • Mode 1 – Append without shared read access: adds data to the end of the existing file; no sharing.

  • Mode 2 – Overwrite with shared read access: overwrites if the file exists; allows other processes to read concurrently.

  • Mode 3 – Append with shared read access: adds data to the end; allows other processes to read the file simultaneously.

Related Functions 

For more granular control, you can also use:  

  • FILE_OPEN_APPEND() – Opens a file in append mode.

  • FILE_OPEN_SHARED() – Opens a file with shared read access.

Combining these functions can provide finer control over file operations. 

Practical Examples

Example 1: Overwriting a File with Shared Read Access  

If you want to generate a new CSV report (overwriting any existing version) while allowing others to read it:  

ASCIIOutputOpen("C:\\Reports\\SalesData.csv", 2);

  

Example 2: Appending Data with Shared Access  

If you need to add new records to an existing file without locking it:  

ASCIIOutputOpen("C:\\Reports\\SalesData.csv", 3);
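The four opening modes can also be expressed programmatically. The following Python sketch is purely illustrative (it is not TI code); it simply maps each OpeningMode value to the overwrite/append and shared-read behaviour described above:

```python
# Illustrative mapping of ASCIIOutputOpen mode codes (not TM1/TI code).
# Semantics taken from the mode descriptions in this article.

def describe_mode(mode: int) -> dict:
    """Return the behaviour implied by an ASCIIOutputOpen OpeningMode code."""
    if mode not in (0, 1, 2, 3):
        raise ValueError("OpeningMode must be 0, 1, 2 or 3")
    return {
        "append": mode in (1, 3),       # modes 1 and 3 append; 0 and 2 overwrite
        "shared_read": mode in (2, 3),  # modes 2 and 3 allow concurrent reads
    }

# Example: mode 3 appends and allows shared read access
print(describe_mode(3))  # {'append': True, 'shared_read': True}
```

This is just a memory aid for choosing the right mode; in a TI process you pass the numeric code directly to ASCIIOutputOpen.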

Conclusion 

ASCIIOutputOpen is a powerful function in TurboIntegrator that helps manage file exports efficiently. By understanding its different modes, you can ensure seamless data operations—whether you're generating reports, logging data, or integrating with external systems.  

Pro Tip: Always verify file paths and permissions before running TI processes to avoid errors!  

Have you used ASCIIOutputOpen in your projects? Share your experiences in the comments! 🚀  

Line

Enabling and configuring alerts for IBM Planning Analytics application and server


Looking for ways to monitor the health and status of your IBM Planning Analytics (PA) Application and Server?

Here are some methods for automating monitoring and receiving alerts whenever issues arise in the backend of your PA applications.

Before enabling these alerts, it's important to understand the key areas to monitor. Monitoring these aspects ensures your PA applications remain healthy, stable, and optimised for performance.

NOTE: You need the Administrator role to view and perform all of the steps below.

1. Database Health Monitoring

To assess the health of your PA applications and databases, follow these steps:

1. Log in to IBM Planning Analytics Workspace.

2. Navigate to Administration and click Databases.

3. Under Databases, select the desired PA application.

4. On the right-hand side of the page, click on Details to view the status and health metrics.

This provides a quick overview of the database’s performance and any potential issues that may require attention.

Sample Screenshot:


You will see various status icons that indicate the current health of the PA application. Here's what each icon represents:

  • Healthy: the PA application is running without any issues.

  • Warning: the PA application is at risk of moving into a critical state; proactive attention may be required.

  • Critical: the PA application is in a critical state and may lead to a system failure or downtime if not addressed immediately.

To set up automatic alerts for your PA application:

1. On the right-hand side of the application's detail page, click on Alerts.

2. From there, you can configure the threshold values that will trigger alerts based on system performance or issues.

3. To enable the Alerts, click the corresponding toggle button; its state changes to indicate it is enabled.

This allows proactive monitoring by notifying you when predefined conditions are met.

Sample Screenshot:

4. We can define the Warning threshold and Critical threshold values based on the size and memory utilised by the PA application under stable conditions.

5. Apart from that, we have options to define Critical Threshold values for factors such as:

  • Max thread wait time: set the Critical Threshold for the maximum thread wait time in the respective PA Application, so that long-waiting threads can be killed before the PA instance slows down.

  • Threads in run state: set the Critical Threshold so that threads do not remain in the Run state longer than expected in the respective PA Application, which could slow down the server.

  • Database unresponsive: set the Critical Threshold to flag when the Database/PA application is unresponsive, so that it can be actioned as soon as possible.

6.  We can enable the Database Shutdown Alert to receive notifications of PA Application stop/downtime and start/restart activity.

7.  We can add multiple email IDs, separated by commas, in the Notify email IDs text box to receive notifications for the enabled Alerts.

8.  Click Apply to save the change made to the Alerts.
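Conceptually, the Warning and Critical thresholds described above partition a monitored metric into the three health states (healthy, warning, critical). A small illustrative Python sketch, with hypothetical metric and threshold values:

```python
def health_state(value: float, warning: float, critical: float) -> str:
    """Classify a monitored metric against warning/critical thresholds."""
    if value >= critical:
        return "critical"
    if value >= warning:
        return "warning"
    return "healthy"

# Hypothetical memory utilisation (GB) against configured thresholds
print(health_state(14.0, warning=12.0, critical=16.0))  # warning
```

Planning Analytics performs this evaluation for you once the thresholds are configured; the sketch only shows the underlying idea.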

2. Agent/PA Server Health Monitoring

To assess the health of your PA Server/Agent, follow these steps:

1.    Log in to IBM Planning Analytics Workspace.
2.    Navigate to Administration and click Databases.
3.    Under Agents, select the desired Agent.
4.    On the right-hand side of the page, click on Details to view the status and health metrics.

This provides a quick overview of the agent’s performance and any potential issues that may require attention.

Sample screenshot:

To set up automatic alerts for your PA Server:

1.    On the right-hand side of the application's detail page, click on Alerts.
2.    From there, you can configure the threshold values that will trigger alerts based on system performance or issues. 
3.    To enable the Alerts, click the corresponding toggle button; its state changes to indicate it is enabled.

Sample Screenshot:

4.    We can define the Warning threshold and Critical threshold values based on the size and memory utilised by the PA application under stable conditions.
5.    We can add multiple email IDs, separated by commas, in the Notify email IDs text box to receive notifications for the enabled Alerts.
6.    Click Apply to save the changes made to the Alerts.

Line

Unlocking the future of financial planning with IBM Planning Analytics and AI assistant


The need for agile financial planning

In today’s rapidly evolving business landscape, organisations face unprecedented volatility, supply chain disruptions, fluctuating demand, inflationary pressures, and geopolitical uncertainties. Traditional financial planning methods, reliant on static spreadsheets and manual processes, are no longer sufficient. Businesses need real-time insights, predictive foresight, and the ability to pivot quickly in response to changing conditions.

Enter IBM Planning Analytics with Watson’s AI Assistant, a cutting-edge solution that combines multidimensional modelling with conversational AI. This solution is transforming how enterprises approach budgeting, forecasting, and performance management. This isn’t just an incremental improvement; it’s a paradigm shift in financial planning and analysis (FP&A).


What is IBM Planning Analytics with AI Assistant?

IBM Planning Analytics, built on the powerful TM1 engine, has long been recognised for its:

  • In-memory computing for lightning-fast calculations

  • Multidimensional modelling for complex scenario analysis

  • Seamless Excel integration for user-friendly analytics

Now, with the AI Assistant, the platform goes beyond traditional analytics by embedding Watson-powered artificial intelligence directly into the planning workflow. This AI-driven co-pilot enables users to interact with their data conversationally, uncovering insights that would otherwise require deep technical expertise.

How does the AI assistant work?

Think of it as a data-savvy colleague who can:

  • Answer complex financial questions in natural language (e.g., “Why did Q2 profitability decline in the Asia-Pacific region?”)

  • Automatically detect anomalies and suggest corrective actions

  • Generate predictive forecasts based on historical trends and external factors

  • Run instant what-if scenarios (e.g., “What happens if raw material costs increase by 15%?”)

Unlike traditional BI tools that require users to write queries or build complex models, the AI Assistant democratises analytics, making advanced insights accessible to finance teams, business leaders, and operational managers alike.

Key Benefits of IBM Planning Analytics with AI Assistant

Natural language queries – No coding required

Gone are the days of struggling with MDX or complex formulas. Users can simply ask questions in plain English, such as:

  • “Show me sales performance by region last quarter.”

  • “Why are operating expenses higher than forecast?”

  • “Predict next quarter’s revenue based on current trends.”

The AI Assistant interprets intent, retrieves relevant data, and presents answers in interactive dashboards, charts, or drill-down reports, eliminating the need for IT intervention.

Real-time cognitive insights

The AI Assistant continuously monitors data patterns, flagging anomalies and suggesting corrective actions before they escalate into bigger issues. For example:

  • “Inventory turnover in the Northeast is 20% below target, recommend adjusting procurement orders.”

  • “Marketing spend is exceeding budget due to higher-than-expected digital ad costs.”

This proactive intelligence helps businesses stay ahead of risks and opportunities.

Instant scenario modelling & what-if analysis

Strategic planning no longer takes weeks. With AI-powered scenario modelling, finance teams can:

  •  Test multiple business conditions in seconds (e.g., “What if interest rates rise by 2%?”)

  • Compare outcomes side-by-side

  • Adjust assumptions dynamically

This capability is invaluable for risk management, capital allocation, and growth planning.

Democratised analytics for cross-functional teams

The AI Assistant breaks down data silos, allowing:

  • Finance teams to explore profitability drivers

  • Sales leaders to assess pipeline impacts

  • Supply chain managers to optimise inventory levels

By making analytics self-service, organisations reduce dependency on IT and accelerate decision-making.

Explainable AI: Not just predictions, but reasons

Many AI tools provide forecasts but fail to explain why a trend is occurring. IBM’s AI Assistant goes further by:

  • Highlighting key drivers behind variances (e.g., “Q3 revenue dipped due to delayed product launches in Europe.”)

  • Suggesting actionable recommendations (e.g., “Consider reallocating budget to high-growth markets.”)

This transparency builds trust in AI-driven insights.

Real-world use case: Transforming a CFO’s workflow

Imagine a CFO who starts their day with an AI-generated briefing:

“Good morning. Last week, operating margins in the retail division fell by 8% due to higher logistics costs. Supplier X increased rates by 12%. Recommended actions: Renegotiate contracts or explore alternative vendors. Additionally, Q4 demand forecasts suggest a 15% increase, consider ramping up production.”

This level of automated, intelligent guidance enables faster, more informed decisions, reducing planning cycles from weeks to hours.

Seamless Integration with Existing Tools

IBM Planning Analytics doesn’t operate in isolation. It integrates with:

  • Microsoft Excel (for familiar spreadsheet-based planning)

  • Power BI & Tableau (for advanced visualisations)

  • ERP systems (SAP, Oracle, NetSuite) for real-time data synchronisation

The AI Assistant acts as a universal translator, bridging gaps between disparate systems and delivering unified insights.

The future of work: Augmented, not automated

A common fear is that AI will replace human jobs. However, IBM Planning Analytics is designed to augment, not replace, FP&A teams.

  • AI handles data processing, anomaly detection, and predictive modelling.

  • Humans focus on strategy, stakeholder collaboration, and creative problem-solving.

The result? Higher productivity, deeper insights, and more strategic impact.

Is your organisation ready for AI-driven planning?

Adopting IBM Planning Analytics with AI Assistant requires:

  • A shift from manual to automated processes

  • Trust in data-driven decision-making

  • Willingness to experiment with AI-powered insights

For companies that embrace this transformation, the rewards are substantial:

  • Faster, more accurate forecasts

  • Proactive risk mitigation

  • Empowered teams with self-service analytics

Start your AI-powered planning journey

The best way to experience the power of IBM Planning Analytics with AI Assistant is to run a pilot project. Begin with a single department (finance, sales, or operations) and measure the impact.

Your planning process will never be the same.

📅 Ready to explore how AI can revolutionise your financial planning? Contact us for a demo and learn more.

Line

Unlocking the power of IBM Planning Analytics with execute HTTP request


Today, we’re excited to explore a game-changing function that enhances the versatility of your Planning Analytics platform. Imagine a tool that not only streamlines your data processes but also connects your Planning Analytics seamlessly with external systems. This innovation allows you to execute HTTP requests directly within your TurboIntegrator (TI) processes, transforming your Planning Analytics into an integral part of your interconnected ecosystem. Join us as we delve into the possibilities this function brings and how it can elevate your data management strategies to new heights.


Why Execute HTTP Request is a Game-Changer

What makes this function truly versatile is its ability to connect with any external system that supports APIs. The only limit is your imagination and the capabilities of the APIs you wish to connect to. Integrating Planning Analytics with external systems allows developers to break free from traditional limitations and extend the functionality of their applications.

This session will include practical demonstrations of several use cases that highlight the power of the Execute HTTP Request. By the end, I hope to inspire you to explore how you can leverage this function to enhance your TM1 applications and workflows.

Demo 1: Hot Promotion of Objects Between Instances

Let's dive into our first demonstration on how to perform hot promotion of objects from one instance to another. Traditionally, migrating objects between instances involved shutting down the target server. However, using the Execute HTTP Request, we can do this in real-time.

  1. Setting Up the TI Process:
     
    • Open the workbench in your workspace and create a new TI process.
    • Declare necessary constants and set your source and target instances (e.g., SmartCo to Demo Server).
  2. Use of HTTP Execute Request:
     
    • Fetch dimensions from the source instance and check for their existence in the target instance.
    • For non-existing dimensions, save them as JSON files and use the HTTP Execute Request to migrate them to the target instance.

Let’s execute this process! Once completed, you’ll see that the dimensions have been successfully migrated.
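The existence check in step 2 of this demo reduces to a set difference between the two instances' dimension lists. A minimal Python sketch of that logic follows; the dimension names are hypothetical, and in practice both lists would come from each instance's REST API:

```python
def dimensions_to_migrate(source_dims, target_dims):
    """Return source dimensions missing from the target, preserving source order."""
    target_set = set(target_dims)
    return [d for d in source_dims if d not in target_set]

# Hypothetical dimension lists for the two instances
source = ["Product", "Region", "Version", "Measure"]
target = ["Product", "Version"]

print(dimensions_to_migrate(source, target))  # ['Region', '2 missing'] -> ['Region', 'Measure']
```

Only the dimensions returned by this check would then be serialised to JSON and pushed to the target instance via the Execute HTTP Request.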

Demo 2: Executing a Process Across Instances

Next, we'll demonstrate the ability to execute a process from one instance in another:

  1. Migrate TI Processes:
     
    • Similar to dimension migration, retrieve the TI process (like Sample TI) from the source instance and save it as a JSON file.
  2. Execute the TI Process:
     
    • Use the Execute HTTP Request to trigger execution from the target instance while utilising its response to capture status codes and log outputs.

After running this process, you should see that both the processes have been migrated and executed successfully.

Demo 3: Loading Currency Conversion Rates

In this demo, we will load real-time currency conversion rates from a website using its API:

  1. Call the API:
     
    • Set up an HTTP GET request to retrieve USD conversion rates.
  2. Extract and Utilise Data:
     
    • Capture the JSON response and extract required currency rates using JSON functions.


Run the process, and you will observe the real-time conversion rates being fetched and displayed.
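The extraction in step 2 can be sketched in Python. The response shape below is a hypothetical stand-in; the actual JSON structure depends on the currency API you call:

```python
import json

# Hypothetical JSON payload, standing in for the API's HTTP response body
response_body = '{"base": "USD", "rates": {"AUD": 1.52, "EUR": 0.92, "GBP": 0.79}}'

def extract_rate(body: str, currency: str) -> float:
    """Pull a single conversion rate out of the JSON response."""
    payload = json.loads(body)
    return payload["rates"][currency]

print(extract_rate(response_body, "AUD"))  # 1.52
```

In a TI process the same parsing would be done with the built-in JSON functions against the captured response, before writing the rates into a cube.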

Demo 4: Sending Teams Notifications

Next, I’ll show you how to send automated notifications to Microsoft Teams:

  1. Integrate with Microsoft Power Automate:
     
    • Set up a Power Automate flow to send notifications.
  2. Trigger Notification from System:
     
    • Use Execute HTTP Request to trigger alerts in Teams based on process execution results.


After execution, you should see notifications appear in your Teams channel.

Demo 5: Sending Emails via HTTP Requests

Finally, we'll explore how to send emails:

  1. Power Automate for Email Notifications:
     
    • Again, set up Power Automate to manage email sending through appropriate HTTP requests.
  2. Dynamic Email Content:
     
    • Utilise dynamic fields for subject and body based on execution results.

After executing this process, you will receive the email in your mailbox.

Conclusion

Today, we have unlocked the extensive capabilities of the Execute HTTP Request function in IBM Planning Analytics. We showcased hot promotion between instances, cross-instance process execution, real-time data fetching, as well as integration with Microsoft Teams and email notifications. 

Thank you all for attending this session. I hope you found it beneficial and feel inspired to explore the functionality of Planning Analytics further. Let’s move toward a more integrated and dynamic future in our analytics processes!

Line

Navigating the Storm: A Double Migration


The past few weeks have been a whirlwind, a high stakes balancing act that tested the limits of our team's resilience and expertise. We simultaneously managed two major client migrations. Both were for high profile large clients and part of their IBM Planning Analytics Modernisation initiative:


The clients

News Corp - News Corp Australia tells the stories that matter to 18.2 million Australians every month as an important part of News Corp, a diversified global media and information services company. From breaking news in the morning to deciding dinner that night, Australia trusts our brands to inform, inspire and delight across the day – including The Australian, The Daily Telegraph, Herald Sun, The Courier-Mail, The Advertiser, Mercury, NT News, Townsville Bulletin, The Cairns Post, Gold Coast Bulletin, Geelong Advertiser, news.com.au, Vogue, GQ, Kidspot, taste.com.au and plenty more. More here https://www.newscorpaustralia.com/

A long-time TM1 user, News Corp was looking to refresh and modernise its application by moving to the cloud, utilising the new dashboarding capabilities of Workspace, and starting to test AI capabilities for the Finance team. TM1 is one of the core applications within the Finance team, and they could not afford the risk of a prolonged upgrade to the cloud.

BlueScope - A global leader in metal coating and painting products for the building and construction industries, providing vital components for houses, buildings, structures, vehicles, and more.

They have built a solid foundation for growth with a diverse portfolio of businesses in some of the largest and fastest-growing economies of the world. They are headquartered in Australia, with people and operations spread across North America, Australia, New Zealand, the Pacific Islands, and throughout Asia. More here https://www.bluescope.com/

BlueScope is also a long-term TM1 user, with the application used across a number of areas including demand planning, forecasting and reporting in the Finance teams. They were on an older version of on-prem TM1 and upgraded to the latest version of on-prem IBM Planning Analytics (TM1).

A Perfect Storm

Both projects presented unique challenges. NEWS's migration required significant user training and change management, while BlueScope's upgrade involved complex technical configurations and intricate coordination with multiple stakeholders. To make matters even more challenging, both go-lives were scheduled for the same day!

Overcoming the Odds

How did we navigate this perfect storm?

  • Strong Leadership: Clear and decisive leadership was crucial in keeping the projects on track. By setting clear expectations, prioritising tasks, and making timely decisions, we were able to mitigate risks and ensure smooth execution.
  • Effective Teamwork: Our team demonstrated exceptional teamwork and collaboration. By working closely together, we could share knowledge, support each other, and address challenges proactively.
  • Agile Methodology: We adopted an agile approach, breaking down the projects into smaller, manageable phases. This allowed us to adapt to changing circumstances and deliver value incrementally. We have also built up a comprehensive checklist for our TM1 upgrades, which makes them easier and reduces risk.
  • Robust Communication: Open and transparent communication was key to keeping all stakeholders informed and aligned. Regular status updates, clear documentation, and effective problem-solving ensured a smooth transition.

Lessons Learned

These experiences have taught us valuable lessons:

  • Prioritise and Plan: Careful planning and prioritisation are essential, especially when managing multiple projects simultaneously.
  • Embrace Flexibility: Be prepared to adapt to unexpected challenges and changes in scope.
  • Build Strong Relationships: Strong relationships with clients and team members are crucial for successful project delivery.
  • Learn from Mistakes: Analyse past projects to identify areas for improvement and avoid repeating errors.

Successful outcomes for both clients

The upgrades were successful for both clients, and both went live on the same day. This was a testament to the team’s technical ability and tenacity in following our upgrade checklist. All testing and end-user training went well. One of our key tenets for upgrades is to ensure that client communication around changed functionality and look and feel is explained, trained and tested. The stakeholders on the client side were great, and the whole team went above and beyond during the upgrade and deployment.

Both upgrades took under six weeks to complete, with minimal disruption to the business.

A huge shoutout goes to Alpheus and Rajan for their stellar work on the NEWS migration, ensuring a smooth transition to the Cloud, and to Baburao for expertly managing the BlueScope upgrade, and overcoming every hurdle that came our way.

Your Turn

Have you faced similar challenges in managing multiple simultaneous projects? How did you overcome them? Share your experiences and insights in the comments below.

If you are looking to modernise or upgrade your IBM Planning Analytics, then contact us and we would be happy to guide you. 

Line

Integrating transactions logs to web services for PA on AWS using REST API


In this blog post, we will walk through the process of exposing transaction logging in Planning Analytics (PA) V12 on AWS to users. Currently, Planning Analytics has no user interface (UI) option to access transaction logs directly from Planning Analytics Workspace. However, there is a workaround: expose the transactions to a host server and access the logs there. By following these steps, you can successfully access transaction logs in Planning Analytics V12 on AWS using the REST API.


Step 1: Creating an API Key in Planning Analytics Workspace

The first step in this process is to create an API key in Planning Analytics Workspace. An API key is a unique identifier that provides access to the API and allows you to authenticate your requests.

  1. Navigate to the API Key Management Section: In Planning Analytics Workspace, go to the administration section where API keys are managed.
  2. Generate a New API Key: Click on the option to create a new API key. Provide a name and set the necessary permissions for the key.
  3. Save the API Key: Once the key is generated, save it securely. You will need this key for authenticating your requests in the following steps.

Step 2: Authenticating to Planning Analytics As a Service Using the API Key

Once you have the API key, the next step is to authenticate to Planning Analytics as a Service using this key. Authentication verifies your identity and allows you to interact with the Planning Analytics API.

  1. Prepare Your Authentication Request: Use a tool like Postman or any HTTP client to create an authentication request.
  2. Set the Authorization Header: Include the API key in the Authorization header of your request. The header format should be Authorization: Bearer <API Key>.
  3. Send the Authentication Request: Send a request to the Planning Analytics authentication endpoint to obtain an access token.
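Putting the three steps together, the request can be sketched with Python's standard library alone. The endpoint URL below is a placeholder (your actual Planning Analytics service URL differs), and the request is only constructed here, not sent:

```python
import urllib.request

def build_auth_request(endpoint: str, api_key: str) -> urllib.request.Request:
    """Build an HTTP request carrying the PA API key as a Bearer token."""
    req = urllib.request.Request(endpoint)
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

# Placeholder endpoint and key; substitute your own values
req = build_auth_request("https://example.com/api/v1/Configuration", "my-api-key")
print(req.get_header("Authorization"))  # Bearer my-api-key
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would then return the access token described in step 3, per the IBM technote linked below.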

Detailed instructions for Step 1 and Step 2 can be found in the following IBM technote:

How to Connect to Planning Analytics as a Service Database using REST API with PA API Key

Step 3: Setting Up an HTTP or TCP Server to Collect Transaction Logs

In this step, you will set up a web service that can receive and inspect HTTP or TCP requests to capture transaction logs. This is crucial if you cannot directly access the AWS server or the IBM Planning Analytics logs.

  1. Choose a Web Service Framework: Select a framework like Flask or Django for Python, or any other suitable framework, to create your web service.
  2. Configure the Server: Set up the server to listen for incoming HTTP or TCP requests. Ensure it can parse and store the transaction logs.
  3. Test the Server Locally: Before deploying, test the server locally to ensure it is correctly configured and can handle incoming requests.

For demonstration purposes, we will use a free web service provided by Webhook.site. This service allows you to create a unique URL for receiving and inspecting HTTP requests. It is particularly useful for testing webhooks, APIs, and other HTTP request-based services.

Step 4: Subscribing to the Transaction Logs

The final step involves subscribing to the transaction logs by sending a POST request to Planning Analytics Workspace. This will direct the transaction logs to the web service you set up.

Practical Use Case for Testing IBM Planning Analytics Subscription

Below are the detailed instructions related to Step 4:

  1. Copy the URL Generated from Webhook.site:
    • Visit Webhook.site and copy the generated URL (e.g., https://webhook.site/<your-unique-id>). The <your-unique-id> refers to the unique ID found in the "Get" section of the Request Details on the main page.

  2. Subscribe Using the Webhook.site URL:
    • Open Postman or any HTTP client.
    • Create a new POST request to the subscription endpoint of Planning Analytics.
    • In Postman, update your subscription to use the Webhook.site URL via the POST request below:

  • In the body of the request, paste the URL generated from Webhook.site:

{
 "URL": "https://webhook.site/your-unique-id"
}
<tm1db> is a variable that contains the name of your TM1 database.

Note: Only the transaction log entries created at or after the point of subscription will be sent to the subscriber. To stop the transaction logs, update the POST query by replacing /Subscribe with /Unsubscribe.
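As a small illustrative sketch, the subscription body shown above can be assembled and serialised in Python. The webhook URL is the placeholder from Webhook.site; the actual subscription endpoint and database name are environment-specific (see the IBM technote referenced earlier):

```python
import json

def build_subscription_body(webhook_url: str) -> str:
    """Serialise the subscribe/unsubscribe request body shown above."""
    return json.dumps({"URL": webhook_url})

body = build_subscription_body("https://webhook.site/your-unique-id")
print(body)  # {"URL": "https://webhook.site/your-unique-id"}
```

The same string would be pasted into the body of the POST request in Postman or any other HTTP client.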

By following these steps, you can successfully enable and access transaction logs in Planning Analytics V12 on AWS using REST API.

Line

Tips on how to manage your Planning Analytics (TM1) effectively


Effective management of IBM Planning Analytics (TM1) can significantly enhance your organization’s financial planning and performance management.


Here are some essential tips to help you optimize your Planning Analytics (TM1) processes:

1. Understand Your Business Needs

Before diving into the technicalities, ensure you have a clear understanding of your business requirements. Identify key performance indicators (KPIs) and metrics that are critical to your organization. This understanding will guide the configuration and customization of your Planning Analytics model.

2. Leverage the Power of TM1 Cubes

TM1 cubes are powerful data structures that enable complex multi-dimensional analysis. Properly designing your cubes is crucial for efficient data retrieval and reporting. Ensure your cubes are optimized for performance by avoiding unnecessary dimensions and carefully planning your cube structure to support your analysis needs.

3. Automate Data Integration

Automating data integration processes can save time and reduce errors. Use ETL (Extract, Transform, Load) tools to automate the extraction of data from various sources, its transformation into the required format, and its loading into TM1. This ensures that your data is always up-to-date and accurate.

4. Implement Robust Security Measures

Data security is paramount, especially when dealing with financial and performance data. Implement robust security measures within your Planning Analytics environment. Use TM1’s security features to control access to data and ensure that only authorized users can view or modify sensitive information.

5. Regularly Review and Optimize Models

Regularly reviewing and optimizing your Planning Analytics models is essential to maintain performance and relevance. Analyze the performance of your TM1 models and identify any bottlenecks or inefficiencies. Periodically update your models to reflect changes in business processes and requirements.

6. Utilize Advanced Analytics and AI

Incorporate advanced analytics and AI capabilities to gain deeper insights from your data. Use predictive analytics to forecast future trends and identify potential risks and opportunities. TM1’s integration with other IBM tools, such as Watson, can enhance your analytics capabilities.

7. Provide Comprehensive Training

Ensure that your team is well-trained in using Planning Analytics and TM1. Comprehensive training will enable users to effectively navigate the system, create accurate reports, and perform sophisticated analyses. Consider regular training sessions to keep the team updated on new features and best practices.

8. Foster Collaboration

Encourage collaboration among different departments within your organization. Planning Analytics can serve as a central platform where various teams can share insights, discuss strategies, and make data-driven decisions. This collaborative approach can lead to more cohesive and effective planning.

9. Monitor and Maintain System Health

Regularly monitor the health of your Planning Analytics environment. Keep an eye on system performance, data accuracy, and user activity. Proactive maintenance can prevent issues before they escalate, ensuring a smooth and uninterrupted operation.

10. Seek Expert Support

Sometimes, managing Planning Analytics and TM1 can be complex and may require expert assistance. Engaging with specialized support services can provide you with the expertise needed to address specific challenges and optimize your system’s performance.

By following these tips, you can effectively manage your Planning Analytics environment and leverage the full potential of TM1 to drive better business outcomes. Remember, continuous improvement and adaptation are key to staying ahead in the ever-evolving landscape of financial planning and analytics.

For specialized TM1 support and expert guidance, consider consulting with professional service providers like Octane Software Solutions. Their expertise can help you navigate the complexities of Planning Analytics, ensuring your system is optimized for peak performance. Book me a meeting

Line

Saying Goodbye to Cognos TM1 10.2.x: Changes in support effective April 30, 2024

2 min read

In a recent announcement, IBM unveiled changes to the Continuing Support program for Cognos TM1, impacting users of version 10.2.x. Effective April 30, 2024, Continuing Support for this version will cease to be provided. Let's delve into the details.


What is Continuing Support?

Continuing Support is a lifeline for users of older software versions, offering non-defect support for known issues even after the End of Support (EOS) date. It's akin to an extended warranty, ensuring users can navigate any hiccups they encounter post-EOS. However, for Cognos TM1 version 10.2.x, this safety net will be lifted come April 30, 2024.

What Does This Mean for Users?

Existing customers can continue using their current version of Cognos TM1, but they're encouraged to consider migrating to a newer iteration, specifically Planning Analytics, to maintain support coverage. While users won't be coerced into upgrading, it's essential to recognize the benefits of embracing newer versions, including enhanced performance, streamlined administration, bolstered security, and diverse deployment options like containerization.

How Can Octane Assist in the Transition?

Octane offers a myriad of services to facilitate the transition to Planning Analytics. From assessments and strategic planning to seamless execution, Octane's support spans the entire spectrum of the upgrade process. Additionally, for those seeking long-term guidance, Octane's expertise provides invaluable Support Packages covering both the development and support facets of your TM1 application.

FAQs:

  • Will I be forced to upgrade?

    No, upgrading is not mandatory. Changes are limited to the Continuing Support program, and your entitlements to Cognos TM1 remain unaffected.

  • How much does it cost to upgrade?

    As long as you have active Software Subscription and Support (S&S), there's no additional license cost for migrating to newer versions of Cognos TM1. However, this may be a good time to consider moving to the cloud. 

  • Why should I upgrade?

    Newer versions of Planning Analytics offer many advantages, from improved performance to heightened security, ensuring you stay ahead in today's dynamic business environment. Remaining on an unsupported version brings unnecessary risk to your application.

  • How can Octane help me upgrade?

    Octane’s suite of services caters to every aspect of the upgrade journey, from planning to execution. Whether you need guidance on strategic decision-making or hands-on support during implementation, Octane is here to ensure a seamless transition. Plus we are currently offering a fixed-price option for you to move to the cloud. Find out more here 

In conclusion, while bidding farewell to Cognos TM1 10.2.x may seem daunting, it's also an opportunity to embrace the future with Planning Analytics. Octane stands ready to support users throughout this transition, ensuring continuity, efficiency, and security in their analytics endeavours.

Line

Mastering Calculations in Planning Analytics: Adapt to Changing Months with Ease

6 min read

One of the standout features of Planning Analytics Workspace (PAW) is its ability to create calculations in the Exploration view. This feature empowers users to perform advanced calculations without the need for technical expertise. Whether you're using PAW or PAfE (Planning Analytics for Excel), the Exploration view offers a range of powerful capabilities. The Exploration view supports a variety of functions, such as aggregations, mathematical operations, conditional logic, and custom calculations. This means you have the flexibility to perform complex calculations tailored to your specific needs. 

This enables business users to create complex financial calculations and business rules directly within views, providing more accurate and tailored results for analysis and planning, all without relying on IT or development teams. The result is faster, more agile reporting: ad hoc reports and self-service analysis on the fly, in a few simple clicks. This self-service capability puts control in the hands of the users, eliminating lengthy communication processes and waiting for IT teams to fulfill reporting requests.

In this blog post, we will focus on an exciting aspect of the Exploration view: creating MDX-based views that are dynamic and automatically update as your data changes. The beauty of these dynamic views is that users no longer need to manually select members of dimensions to keep their formulas up to date.

Similar to the functionality of dynamic subsets in dimensions, where each click in the set editor automatically generates MDX statements that can be modified, copied, and pasted, the exploration views in Planning Analytics Workspace also generate MDX statements. These MDX statements are created behind the scenes as you interact with the cube view. Just like MDX subsets, these statements can be easily customized, allowing you to fine-tune and adapt them to your specific requirements.

By being able to tweak, copy, and paste these MDX statements, you can easily build upon previous work or share your calculations with others.

Currently, the calculations are not inherently dynamic; however, there are techniques that can be employed to make them adapt to changing time periods.

A classic example is variance analysis on a P&L cube, where we wish to add a formula showing the variance of the current month from the previous month. There are many other calculations we could consider, but we will focus on this one in this blog.

If we take our example, the current month and previous month keep changing every month as we roll forward and they are not static. When dealing with changing months or any member in your calculation, it's important to ensure that your calculations remain dynamic and adaptable to those changes. 

To ensure dynamic calculations that reflect changes in months, you have several options to consider:

Manual Approach: You can manually update the column dimensions with the changing months and recreate the calculations each time. However, this method is time-consuming, prone to errors, and not ideal for regular use.

Custom MDX Approach: Another option is to write custom MDX code or modify existing code to reference the months dynamically from a Control cube. While this approach offers flexibility, it can be too technical for end users.

Consolidations Approach: Create consolidations named "Current Month" and "Prior Month" and add the respective months to them as children. Then, use these consolidations in your view and calculations. This approach provides dynamic functionality, but you may need to expand the consolidations to see the specific months, which can be cumbersome.

Alias Attributes Approach: Leverage alias attributes in your MDX calculations. By assigning aliases to the members representing the current and previous months, you can dynamically reference them in your calculations. This approach combines the benefits of the previous methods, providing dynamic calculations, visibility of months, and ease of use without excessive manual adjustments.

In this blog post, we will focus on the alias attributes approach as a recommended method for achieving dynamic calculations in PAW or PAfE. We will guide you step-by-step through the process of utilizing alias attributes to ensure your calculations automatically adapt to changing months. By following this approach, you can simplify your calculations, improve efficiency, and enable non-technical users to perform dynamic variance analysis effortlessly.

To create dynamic calculations for variances between the current and prior month, you can follow these steps:

  • Step 1: Ensure you have an alias attribute available in your Month dimension. If not, create a new alias attribute specifically for this purpose.
  • Step 2: Update the alias with the values "Curr Month" and "Prior Month" for the respective months.
  • Step 3: Open the exploration view in PAW and select the two months (current and prior) on your column or row dimension. 
  • Step 4: Create your variance calculation using the exploration view's calculation capabilities. This could involve subtracting the P&L figures of the prior month from the current month, for example.
  • Step 5: Open the MDX code editor and replace the actual month names in the MDX code with the corresponding alias values you updated in Step 2. You can copy the code into Notepad and use the "Find and Replace" function to make this process faster and more efficient.
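Step 5 boils down to a simple find-and-replace over the generated MDX. A minimal sketch in Python, where the month member names and the generated set expression are illustrative only (substitute the members from your own Month dimension):

```python
# Generated MDX with hard-coded months (illustrative member names)
mdx = "{ [Month].[Jun-2023], [Month].[May-2023] }"

# Map each hard-coded month to the alias value assigned in Step 2
replacements = {"Jun-2023": "Curr Month", "May-2023": "Prior Month"}
for member, alias in replacements.items():
    mdx = mdx.replace(member, alias)

print(mdx)  # { [Month].[Curr Month], [Month].[Prior Month] }
```

After the swap, the set resolves through the alias, so rolling the alias forward each month updates the view and the variance calculation automatically.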


By replacing the month names with the alias values, you ensure that the calculation remains dynamic and adapts to the changing months without manual intervention. When you update the alias values in the Month dimension, it will reflect in the exploration view. As a result, the months displayed in the view will be dynamically updated based on the alias values. This ensures that your calculations remain synchronized with the changing months without the need for manual adjustments.


Important Note: When selecting the months in the set editor, it is crucial to explicitly select and move the individual months from the Available members pane (left) to the Current set pane (right). This ensures that unnecessary actions, such as expanding a quarter to select a specific month, are not recorded in the MDX code generated by the exploration view, which could otherwise cause issues when replacing the member names with alias values.

This approach of using alias attributes to make calculations dynamic can be extended to various other calculations in Planning Analytics Workspace. It provides a flexible and user-friendly method to ensure that your calculations automatically adapt to changing dimensions or members.

That being said, there may be certain scenarios where alternative approaches, such as writing custom MDX code or utilizing a control cube, are necessary. Each situation is unique, and the chosen approach should align with the specific requirements and constraints of the calculation; however, the proposed approach should still work for a wide variety of calculations in IBM Planning Analytics.

Line

Exploring the Latest Enhancements of IBM Planning Analytics Components

3 min read

As the world moves towards more data-driven decision-making, businesses are increasingly looking for effective planning and budgeting solutions. IBM Planning Analytics is the go-to for businesses looking for a comprehensive set of tools to help them manage their budgeting and planning process.

Slide2

With Planning Analytics, businesses can access powerful analytics to make more informed decisions, leverage advanced features to create complex models, and gain better insights into their financial data.

IBM is constantly improving the functionalities and features of the IBM Planning Analytics components. This includes Planning Analytics Workspace (PAW), Planning Analytics for Excel (PAfX), and Planning Analytics with Watson. With these updates, businesses can take advantage of new features to help them manage their budgeting and planning process more effectively.

In the last 12 months, IBM has released several updates to its Planning Analytics components.

In PAW, users can now access advanced analytics such as forecast simulations, predictive models, and scenario analysis. They can also perform in-depth analysis on their data with the new Visual Explorer feature. In addition, users can now access a library of planning and budgeting models, which can be customized to fit the needs of their organization. (download PDF file to get the full details)

Slide3

 

Slide6

In PAfX, users can now access advanced features such as SmartViews and SmartCharts. SmartViews allows users to visualize their data in various ways, while SmartCharts allows users to create interactive charts and graphs. Users can also take advantage of the new custom formatting options to make their reports look more professional.

Slide7

 

Slide8

Finally, with Planning Analytics with Watson, users can access powerful AI-driven insights. This includes AI-driven forecasting, which allows users to create more accurate forecasts. In addition, Watson can provide insights into the drivers of their business, allowing users to make more informed decisions.

 

Slide9

 

Overall, IBM’s updates to the Planning Analytics components provide businesses with powerful tools to help them manage their budgeting and planning process. With these updates, businesses can take advantage of the latest features to quickly access data-driven insights, create more accurate forecasts, and gain better insights into their financial data.

Download the PDF file below to get the full version for each IBM Planning Analytics component.

Line

Top 12 Planning Analytics features that you should be using in 2023

8 min read

Amin Mohammad, the IBM Planning Analytics Practice Lead at Octane Solutions, takes you through his top 12 capabilities of Planning Analytics in 2023. These are his personal favorites, and there is more beyond what he covers here.

Top 12 picks of Planning Analytics

He has decided to divide his list into PAfE and PAW, as each has its own unique capabilities, and to highlight them separately.

Planning Analytics for Excel (PAfE)

1. Support for alternate hierarchies in TM1 Web and PAfE

Starting with the TM1 Set function, which has finally opened up the option to use alternate hierarchies in TM1 Web. It contains nine arguments, as opposed to the four in SubNM, adding to its flexibility, and it also supports MDX expressions as one of the arguments. This function can be used as a good replacement for SubNM.

2. Updated look for cube viewer and set editor

Planning Analytics Workspace and Cognos Analytics have taken the extra step to provide a consistent user experience. This includes the incorporation of the Carbon Design Principles, which have been implemented in the Set Editor and cube viewer in PAfE. Users can enjoy an enhanced look and feel for these components, as well as improved capabilities. This is an excellent addition that makes the most of the user experience.

3. Creating User Defined Calculations (UDC)

Hands down, User Defined Calculations is by far the most impressive capability added recently. It allows you to create custom calculations using the Define calc function in PAfE, which also works in TM1 Web. With this, you can easily perform various calculations, such as consolidating data based on a few selected elements or performing arithmetic calculations on your data. Before this capability, we had to create custom consolidation elements in the dimension itself to achieve these results in PAfE, leading to multiple consolidated elements within the dimension and making it very convoluted. The only downside is that it can be a bit technical for some users, making it a barrier to mass adoption. Additionally, the sCalcMun argument within this function is case-sensitive, so bear that in mind. Hopefully this is fixed in future releases.

4. Version Control utility

The Version Control utility helps to validate whether the version of PAfE you are using is compatible with the version of your Planning Analytics data source. If the two versions are not compatible, you cannot use PAfE until you update the software. Version Control uses three compatibility types to highlight the compatibility status:

  • normal
  • warning
  • blocked

Administrators can also configure Version Control to download a specific version of PAfE when the update button is clicked, helping to ensure the right version is used across your organization.

Planning Analytics Workspace (PAW)

5. Single Cell widget

Planning Analytics Workspace has recently added the Single Cell widget as a visualization, making it easier to update dimension filters. Before this, the Single Cell widget could be added by right-clicking a particular data point, but it had its limitations. 

One limitation that has been addressed is the inability to update dimension filters in the canvas once the widget has been added. In order to update it, one has to redo all steps, but the single widget visualization has changed this. Now, users can change the filters and the widget will update the data accordingly. This has been a great improvement as far as enhancing user experience goes.

Additionally, the widget can be transformed into any other visualization and vice versa. When adding the widget, the data point selected at that moment is reflected in it. If nothing is selected, the first (top-left) data point in the view is used to create the widget.

Single cell widget

 

6. Sending email notifications to Contributors

You can now easily send email notifications to contributors with the click of a button from the Contribution Panel of the Overview Report. When you click the button, it sends an email to the members of the group assigned to the task. The email option is only activated when the status is either pending approval or pending submission. Clicking the icon sends the email to all members assigned to the group for the task.

Email notification to contributors

7. Add task dependencies

Now, you can add task dependencies to plans, which allows you to control the order in which tasks can be completed. For example, if there are two tasks and Task Two is dependent on Task One, Task Two cannot be opened until Task One is completed. This feature forces users to do the right thing by opening the relevant task and prevents other tasks from being opened until the prerequisite task is completed. This way, users are forced to follow the workflow and proceed in the right order.

8. Approval and Rejections in Plans with email notifications

The email notifications mentioned here are not manually triggered like the ones in pick 6 above. These emails are fully automated and event-based. The events that trigger them include opening a plan step, submitting a step, or approving or rejecting a step. The emails sent out contain a link taking the user directly to the plan step in question, making the planning process easier for users to follow.


"The workflow capabilities of Planning Analytics Workspace have seen immense improvements over time. It initially served as a framework to establish workflows; now it has become a fully matured workflow component with many added capabilities. This allows for a more robust and comprehensive environment for users, making it easier to complete tasks."

9. URL to access the PAW folder

PAW (Planning Analytics Workspace) now offers the capability to share links to a folder within the workspace. This applies to all folders, including the Personal, Favorites, and Recent tabs. This is great because it makes it easier for users to share information, and also makes the navigation process simpler. All around, this is a good addition and definitely makes life easier for the users.

10. Email books or views

The administrator can now configure the system to send emails containing books or views from Planning Analytics Workspace. Previously, the only way to share books or views was to export them into certain formats. However, by enabling the email functionality, users are now able to send books or views through email. Once configured, an 'email' tab will become available when viewing a book, allowing users to quickly and easily share their content. This option was not previously available.

11. Upload files to PA database​

Workspace now allows you to upload files to the Planning Analytics database. This can be done manually using the File Manager, which is found in the Workbench, or through a TI process. IBM has come up with a new property within the action button that enables you to upload the file when running the TI process. Once the file is uploaded, it can be used in the TI process to load data into TM1. This way, users do not have to save the file in a shared location and can simply upload it from their local desktop and load the data. This is a handy new functionality that IBM has added. Bear in mind that the process cannot be run until the file has been successfully uploaded, so if the file is large, it may take time.

12. Custom themes​

Finally, improvements in custom themes. Having the ability to create your own custom themes is incredibly helpful in order to align the coloring of your reports to match your corporate design. This removes the limitation of only being able to use pre-built colors and themes, and instead allows you to customize it to your specific requirements. This gives you the direct functionality needed to make it feel like your own website when any user opens it.

That's all I have for now. I hope you found these capabilities insightful and worth exploring further.

If you want to see the full details of this blog post, click here.

Line

Planning Analytics Audit log – Little known pitfall

2 min read

This blog briefly covers a challenge faced after enabling the audit log in one of our clients' environments. Once the audit log was turned on to capture metadata changes, the scheduled Data Directory backup process started to fail.

After some investigation, I found the cause was the temp file (i.e., tm1rawstore.<TimeStamp> ) generated by the audit log by default and placed in the data directory.

The temp file is used by the audit log to record events before moving them to a permanent file (i.e., tm1auditstore<TimeStamp>). Sometimes you may even notice dimension-related files (i.e., DimensionName.dim.<Timestamp>); these are generated by the audit log to capture dimension-related changes.

RawStoreDirectory is the tm1s.cfg parameter related to the audit log that helped us resolve the issue. It defines the folder path for the audit log's temporary, unprocessed log files (i.e., tm1rawstore.<TimeStamp> and DimensionName.dim.<Timestamp>). If this parameter is not set, these files are placed in the Data Directory by default.

RawStoreDirectory = <Folderpath>

 

Now, let's also see other config parameters related to the audit logs

 

AuditLogMaxFileSize:

This parameter controls the maximum size an audit log file can reach before the file is saved and a new file is created. The unit needs to be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in tm1s.cfg, the default value is 100 MB.

AuditLogMaxFileSize=100 MB

 

AuditLogMaxQueryMemory:

This parameter controls the maximum memory the TM1 server can use when running an audit log query and retrieving the result set. The unit needs to be appended to the value (KB, MB, GB); the minimum is 1 KB and the maximum is 2 GB. If this is not specified in tm1s.cfg, the default value is 100 MB.

AuditLogMaxQueryMemory=200 MB


AuditLogUpdateInterval:

This parameter controls how long the TM1 server waits before moving the contents of the temporary files to a final audit log file. The value is in minutes; for example, a value of 100 means 100 minutes.

AuditLogUpdateInterval=100
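Putting these parameters together, the audit-log section of a tm1s.cfg might look like the sketch below. All values are examples only; AuditLogOn is the switch that enables audit logging in the first place, and the RawStoreDirectory path is a placeholder for a folder outside your data directory.

```ini
AuditLogOn=T
RawStoreDirectory=D:\TM1\AuditTemp
AuditLogMaxFileSize=100 MB
AuditLogMaxQueryMemory=200 MB
AuditLogUpdateInterval=60
```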

 

That's it folks, I hope you learnt something new from this blog.

Line

Planning Analytics for Excel: Trace TI status

2 min read

IBM has been recommending that its users move to Planning Analytics for Excel (PAX) from TM1 Perspectives and/or TM1 Web. This blog is dedicated to clients who have either recently adopted PAX or are contemplating it, and shares steps on how to trace the status of a TI process while running it from Planning Analytics for Excel.

Follow the steps below to run processes and check TI process status.

1. Once you connect to Planning Analytics for Excel, you will see cubes on the right-hand side; if not, you may need to click Task Pane.

 
pax1

 

2. Click the middle icon as shown below and click Show process. This shows all processes (to which the user has access) in the Task Pane.

 
pax2

 

3. You will now be able to see Processes.

 

pax3

 

4. To trace the status of a process triggered via Planning Analytics for Excel, right-click Processes and click Active processes.

 

pax4

 

 

5. A new box will pop up, as shown below.

 
pax5

 

6. You can now run a process from the Task Pane and track its status in the box that popped up in step 5.

 

pax6

 

 

7. You can now see the status of the process in this box. The screen print below shows that, for process cub.price.load.data, 4 out of 5 tasks have completed.

pax7

 

8. The screen prints below show the possible TI process statuses: Working, Completed, and Process completed with errors.

pax8

 

Once done, you should be able to trace TI status in Planning Analytics for Excel. Happy transitioning!

As I pen down my last Blog for 2019, wishing you and your dear ones a prosperous and healthy 2020.

Until next time....keep planning & executing.

 

Line

IBM Planning Analytics Secure Gateway Client: Steps to Set-Up

7 min read

This blog walks through all the steps to install the IBM Secure Gateway Client.

Installing the IBM Secure Gateway Client is one of the crucial steps in setting up a secure gateway connection between Planning Analytics Workspace (on-cloud) and an RDBMS (relational database) on-premise or on-cloud.

Picture1-22-1

What is IBM Secure Gateway?

The IBM Secure Gateway for IBM Cloud service provides a quick, easy, and secure solution for establishing a link between Planning Analytics on cloud and a data source. The data source can reside on an on-premise network or in the cloud, and can be an RDBMS such as IBM Db2, Oracle Database, SQL Server, Teradata, etc.

Secure and Persistent Connection:

A secure gateway must be created for TurboIntegrator to access on-premise RDBMS data sources; it is useful both for importing data into TM1 and for drill-through capability.

By deploying the lightweight, natively installed Secure Gateway Client, a secure, persistent, and seamless connection can be established between your on-premises data environment and the cloud.

The Process:

This is a two-step process:

  1. Create Data source connection in Planning Analytics Workspace.
  2. Download and Install IBM Secure Gateway

To download the IBM Secure Gateway Client:

  1. Log in to Workspace (on-cloud)
  2. Navigate to Administration -> Secure Gateway

Picture2-5

Click the icon shown below; this will prompt a pop-up. Select your operating system and follow the steps to install the client.
Picture3-3

Once you click, a new pop-up will come up where you are required to select the operating system on which you want to install the client.

Picture4-2

Choose the appropriate option and click download.

If downloads default to your Downloads folder, you will find the software there, as shown below.

Picture5-2

Installing the IBM Secure Gateway Client:

To install this tool, right-click it and run as administrator.

Picture6-2

 

Keep the default settings for Destination folder and Language, unless you need to modify them.

Picture7-1

Check the box below if you want to run this as a Windows service.

Picture8-2

Now, this is an important step: we are required to enter gateway IDs and security tokens to establish a secured connection. These need to be copied over from the secure connection created earlier in Planning Analytics Workspace (refer to step 1, Create Data source connection in Planning Analytics Workspace).

Picture9-2

The figure below illustrates Workspace displaying the Gateway ID and Security Token details; these need to be copied and pasted into the Secure Gateway Client (refer to the illustration above).

Picture10-1

If you launch the client with connections to multiple gateways, take care when providing the configuration values:

  1. Gateway IDs must be separated by spaces.
  2. Security tokens, ACL files, and log levels must be delimited by --.
  3. If you don't want to provide one of these three values for a particular gateway, use 'none'.
  4. Choose whether you want the client UI; otherwise select No.

Note: Please ensure that there are no residual white spaces.
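The delimiting rules above can be illustrated with a small, purely hypothetical parser. This is not the client's actual code, and the gateway IDs and tokens are made up; it just shows how the space-separated and `--`-delimited fields line up:

```python
def parse_multi_gateway_fields(gateway_ids: str, security_tokens: str) -> dict:
    """Illustrate the installer's multi-gateway field format: gateway IDs
    are separated by spaces; the matching security tokens are delimited
    by '--', with the literal 'none' standing in for an omitted value."""
    ids = gateway_ids.split()  # split() also guards against residual whitespace
    tokens = [t if t != "none" else None for t in security_tokens.split("--")]
    if len(ids) != len(tokens):
        raise ValueError("provide one token (or 'none') per gateway ID")
    return dict(zip(ids, tokens))

# Two gateways, the second with no token supplied:
# parse_multi_gateway_fields("gwA gwB", "tok1--none")
# → {"gwA": "tok1", "gwB": None}
```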


Now click Install. Once the installation completes successfully, the IBM Secure Gateway Client is ready for use.

The connection is now ready: Planning Analytics can connect to data sources residing on-premises, or on any other cloud infrastructure where the IBM Secure Gateway Client is installed.

 

You may also like reading “Predictive & Prescriptive-Analytics”, “Business-intelligence vs Business-Analytics”, “What is IBM Planning Analytics Local”, “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016”.

Line

What is IBM Watson™ Studio?


IBM Watson™ Studio is a platform for businesses to prepare and analyse data as well as build and train AI and machine learning models in a flexible hybrid cloud environment.

IBM Watson™ Studio enables your data scientists, application developers and subject-matter experts to work together more easily and collaborate with the wider business, delivering faster insights in a governed way.




Watson Studio is also available on the desktop, bringing the most popular portions of Watson Studio Cloud to your Microsoft Windows or Apple Mac PC. With IBM SPSS® Modeler, notebooks and IBM Data Refinery all within a single install, it offers comprehensive and scalable data analysis and modelling abilities.

For the enterprise, there is also Watson Studio Local, a version of the software deployed on-premises inside the firewall, while Watson Studio Cloud is part of the IBM Cloud™, a public cloud platform. No matter which version your business may use, you can start using Watson Studio Cloud and download a trial of the desktop version today!

Over the next five days, we'll send you use cases and other worthwhile materials to review at your convenience. Be sure to check our social media pages for these.

Line

IBM Planning Analytics (TM1) Vs Anaplan



There has been a lot of chatter lately around IBM Planning Analytics (powered by TM1) vs Anaplan. Anaplan is a relatively new player in the market and recently listed on the NYSE, reporting 2019 revenue of USD 240.6M (and, interestingly, an operating loss of USD 128.3M). Compare this with IBM, which reported 2018 revenue of USD 79.5 billion (there is no clear breakdown of how much came from the Analytics business) with a net profit of USD 8.7 billion. The global Enterprise Performance Management (EPM) market is around USD 3.9 billion and expected to grow to USD 6.0 billion by 2022, while the market for spreadsheet-based processes is a whopping USD 60 billion (Source: IDC).

Anaplan was born out of the old Adaytum Planning application; Adaytum was acquired by Cognos, and Cognos in turn was acquired by IBM in 2007. Anaplan also spent USD 176M on sales and marketing, so most people in the industry will have heard of it or come across some form of its marketing. (Source: Anaplan.com)

I’ve decided to have a closer look at some of the crucial features and functionalities and assess how it really stacks up.

Scalability 

There are some issues around scaling up Anaplan cubes where large datasets are under consideration (an 8-billion-cell limit; while this sounds big, most of our clients reach this scale fairly quickly at medium complexity). With IBM Planning Analytics (TM1) there is no need to break up a cube into smaller cubes to meet data limits, nor any need to combine dimensions into a single dimension. Cubes are generally developed with business requirements in mind, not system limitations, thereby offering superior degrees of freedom to the business analyst.

For example, if enterprise-wide reporting were the requirement, the cubes might need to be broken up along a logical dimension such as region or division. This in turn would make consolidated reporting laborious, and slicing and dicing the data difficult, almost impossible.

 


Excel Interface & Integration

Love it or hate it, Excel is the tool of choice for most analysts and finance professionals. I reckon it is unwise to offer a BI tool in today's world without proper Excel integration. I find Planning Analytics (TM1) users love the ability to use the Excel interface to slice and dice, drill up and down hierarchies, and drill through to the data source. The ability to create interactive Excel reports, with cell-by-cell control of data and formatting, is a sure-shot deal clincher.

On the other hand, on exploring Anaplan I realised it offers very limited Excel support.


 

Analysis & Reporting

In today's world, users have come to expect drag-and-drop analysis: the ability to drill down and to build and analyze alternate views of the hierarchy in “real time”. However, if each such query requires data to be moved between cubes and/or separate cubes to be built, it is counterproductive. It increases maintenance and data storage overheads, and you lose sight of a single source of truth as you develop multiple cubes holding the same data in different forms. This is the case with Anaplan due to the software's intrinsic limitations.

Anaplan also requires users to invest in a separate reporting layer, as it lacks native reporting, dashboards and data visualizations.

This in turn results in:

  1. Increased cost
  2. Increased risk
  3. Increased complexity
  4. Limited planning due to data limitations

IBM Planning Analytics, on the contrary, offers out-of-the-box ability to view and analyze all your product attributes, and to slice and dice by any of them.

It also comes with a rich reporting, dashboard and data visualization layer called Workspace. Planning Analytics Workspace delivers self-service web authoring to all users. Through the Planning Analytics Workspace interface, authors have access to many visual options designed to help improve financial input templates and reports. Planning Analytics Workspace benefits include:

  1. Free-form canvas dashboard design
  2. Data entry and analysis efficiency and convenience features
  3. Capability to combine cube views, web sheets, text, images, videos, and charts
  4. Synchronised navigation for guiding consumers through an analytical story
  5. Browser and mobile operation
  6. Capability to export to PowerPoint or PDF


Source: Planning Analytics (TM1) cube

Line

Planning Analytics - Cloud Or On-Premise



This blog details IBM Planning Analytics on-cloud and on-premises deployment options, highlighting the key points that should help you decide whether to adopt the cloud or stay on-premises.

 

IBM Planning Analytics:

As part of its continuous endeavour to improve the application interface and customer experience, IBM rebranded TM1 as Planning Analytics a couple of years back, with many new features and a completely new interface. With this release (the PA 2.x version, as it is called), IBM lets clients choose Planning Analytics as local software or as Software as a Service (SaaS) deployed on the IBM Softlayer cloud.


Planning Analytics on Cloud:

Under this offering, the Planning Analytics system operates in a remotely hosted environment. Clients who choose Planning Analytics deployed on-cloud can reap the many benefits typical of any SaaS offering.

With this subscription, clients need not worry about software installation, versions, patches, upgrades, fixes, disaster recovery, hardware, and so on.

They can focus on building business models and enriching data from different source systems, giving meaning to the data they have by converting it into business-critical, meaningful, actionable insights.

Benefits:

While not an exhaustive list, this covers the significant benefits:

  • Automatic software updates and management.
  • CAPEX Free; incorporates benefits of leasing.
  • Competitiveness; long term TCO savings.
  • Costs are predictable over time.
  • Disaster recovery; with IBM’s unparalleled global datacentre reach.
  • Does not involve additional hardware costs.
  • Environment friendly; credits towards being carbon neutral.
  • Flexibility; capacity to scale up and down.
  • Increased collaboration.
  • Security; with options of premium server instances.
  • Work from anywhere; thereby driving up productivity & efficiencies.

Clients must have an Internet connection to use SaaS, and of course Internet speed plays a major role; in the present world, though, an Internet connection has become a basic necessity for all organizations.


Planning Analytics Local (On-Premise):

Planning Analytics Local is essentially the traditional model: the software is installed on the company's in-house server and computing infrastructure, either in its own data centre or hosted elsewhere.

In an on-premises environment, installation, upgrade, and configuration of IBM® Planning Analytics Local software components are the organization's responsibility.

Benefits of On-Premise:

  • Full control.
  • Higher security.
  • Confidential business information remains within the organization's network.
  • Less vendor dependency.
  • Easier customization.
  • Tailored to business needs.
  • Does not require Internet connectivity, unless “anywhere” access is enabled.
  • Organization has more control over implementation process.

As is evident, the on-premises option comes with some cons as well; a few are listed below.

  • Higher upfront cost
  • Long implementation period.
  • Hardware maintenance and IT cost.
  • In-house Skills management.
  • Longer application dev cycles.
  • Robust but inflexible.

On-premise software demands constant maintenance and ongoing servicing from the company’s IT department.

Organizations on-premises have full control of the software and its related infrastructure, and can perform internal and external audits as and when needed or recommended by governing/regulatory bodies.

Before making the decision, it is also important to consider many other influencing factors: the necessary security level, the potential for customization, the number of users, modellers and administrators, the size of the organization, the available budget, and the long-term benefits to the organization.

While you ponder this, many clients have adopted a “mid-way” hybrid environment, under which applications are gradually moved from on-premises to cloud in a phased manner, based on factors such as workload economics, application evaluation and assessment, and security and risk profiles.

 

You may also like reading “What is IBM Planning Analytics Local”, “IBM TM1 10.2 vs IBM Planning Analytics”, “Little known TM1 Feature - Ad hoc Consolidations”, “IBM PA Workspace Installation & Benefits for Windows 2016”.

For more information: to check your existing Planning Analytics (TM1) entitlements and understand how to upgrade to Planning Analytics Workspace (PAW), reach out to us at info@octanesolutions.com.au for further assistance.

Octane Software Solutions Pty Ltd is an IBM Registered Business Partner specialising in Corporate Performance Management and Business Intelligence. We provide our clients advice on best practices and help scale up applications to optimise their return on investment. Our key services include Consulting, Delivery, Support and Training. Octane has its head office in Sydney, Australia as well as offices in Canberra, Bangalore, Gurgaon, Mumbai, and Hyderabad.

To know more about us visit, OctaneSoftwareSolutions.

Line

Is Your Data Good Enough for Business Intelligence Decisions?



There’s no question that more and more enterprises are employing analytics tools to help in their strategic business intelligence decisions. But there’s a problem - not all source data is of a high quality.

Poor-quality data likely can’t be validated and labelled, and more importantly, organisations can’t derive any actionable, reliable insights from it.

So how can you be confident your source data is not only accurate, but able to inform your business intelligence decisions? It starts with high-quality software.

 

Finding the right software for business intelligence

There are numerous business intelligence services on the market, but many enterprises are finding value in IBM solutions. 

IBM’s TM1 couches the power of an enterprise database in the familiar environment of an Excel-style spreadsheet. This means adoption is quick and easy, while still offering you budgeting, forecasting and financial-planning tools with complete control.

Beyond TM1, IBM Planning Analytics takes business intelligence to the next level. The Software-as-a-Service solution gives you the power of a self-service model while delivering data governance and reporting you can trust. It's a robust cloud solution that is agile while offering foresight through predictive analytics powered by IBM's Watson.

 


 

Data is only one part of the equation

But it takes more than just the data itself to make the right decisions. The data should help you make smarter decisions faster, while your business intelligence solution should make analysing the data easier. 

So how do you ensure top-notch data? Consider these elements of quality data:

  • Completeness: Missing data values aren’t uncommon in most organisations’ systems, but you can’t have a high-quality database where the business-critical information is missing.
  • Standard format: Is there a consistent structure across the data – e.g. dates in a standard format – so the information can be shared and understood?
  • Accuracy: The data must be free of typos and decimal-point errors, be up to date, and be accurate to the expected ‘real-world’ values.
  • Timeliness: Is the data ready whenever it’s needed? Any delays can have major repercussions for decision-making.
  • Consistency: Data that's recorded across various systems should be identical. Inconsistent datasets – for example, a customer flagged as inactive in one system but active in another – degrade the quality of information.
  • Integrity: Is all the data connected and valid? If connections are broken, for example if there’s sales data but no customer attached to it, then that raises the risk of duplicating data because related records are unable to be linked.

Are you looking to harness the power of your source data to make actionable business decisions? Contact Octane to find out how we can help you leverage your data for true business intelligence.

 


 

Line

Self Service: How Big Data Analytics is Empowering Users



 

Smart businesses are seeking out new ways to leverage the benefits of their big data analytics programs, and the self-service model is coming up trumps. By placing the onus directly on business users, enterprises are empowering customers with insights-driven dashboards, reports, and more. But it’s not the only bonus. 

Arguably an even greater upside for organisations is that it alleviates the talent shortage that often comes with big data. With most companies only employing a handful of data experts who can deliver analytics insights to customers, the self-service model means they are freed up to concentrate on more important tasks, while allowing the masses to derive their own insights on their own terms. 

 

What are the real benefits of self service?

If nothing else, a self-service model creates a ‘democratisation’ of big data, giving users the freedom to access the data they need when they need it most: during the decision-making process.

Moreover, there’s a low cost to entry – coupled with reduced expenses thanks to freeing up data science and IT resources – and faster time to insight. When users know what they need and can change their research strategies according to new and changing demands, they become more empowered.

But it’s not all smooth sailing – giving customers the tools they need for self service is only one part of the equation. They must also be educated on the potential pitfalls.

 


 

Avoid the common hurdles

When several users have access to specific data, there’s a risk of multiple copies being made over time, thus compromising the ‘one version of truth’ and possibly damaging any insights that could be derived.

Business users unfamiliar with big data analytics are also prone to mistakes, as they may be unaware of data-preparation complexities – not to mention their own behavioural biases. 

For all these issues, however, education is the solution, which is what Ancestry.com focused on when it began encouraging self-service analytics through its new data-visualisation platform. And with 51 quintillion cells of data you can see why.

 

There’s no harm in starting small with big data analytics

Ancestry.com has over 10 billion historical records and about 10 million registered DNA participants, according to Jose Balitactac, the company's FP&A Application Manager.

The old application they were using was taking hours to do the calculations.  They looked at seven different applications before deciding on IBM Planning Analytics.  

The reason they chose IBM Planning Analytics was its ability to accommodate the company's super-cube of data; other solutions would have required them to “break it into smaller cubes, or reduce the number of dimensions, or join members, such as business units and cost centers.” They didn't want to do that because their processes worked.

They set up a test with IBM to time how long the model took to calculate, and it took only 10-20 seconds, which is what they wanted. You can read more about the Ancestry.com case study here.

If you’re keen to empower your business users through a self-service model, contact Octane today to learn how we can help you harness big data analytics.

 


 
