Monday, May 1, 2023

Smart list Value > Member Name

 

How to read a member name from a Smart List value in a Calculation Manager rule

Smart Lists
Service Administrators use Smart Lists to create custom drop-down lists that users access from form cells. When clicking in cells whose members are associated with a Smart List (as a member property), users select items from drop-down lists instead of entering data. Users can't type in cells that contain Smart Lists. Smart Lists display in cells as down arrows that expand when users click into the cells.

Smart Lists enhance the user interface and give control while setting the assumptions and inputs that drive further processing.

We all know that, in Calculation Manager, you can insert a Smart List into a variable or into formula, script, condition, and member range components.

Problem Statement


If you look at the above form, it is an "Add New Asset" form, part of the Capex model. New assets are introduced in different months, and we should calculate the asset schedule from the period of asset acquisition. That involves calculating Gross Asset, Depreciation, Accumulated Depreciation, and Net Asset Value. Here we focus only on the depreciation calculation, as the purpose is to understand the member lookup from the Smart List.

For a given asset, "New_Asset_001", I want to calculate depreciation from May onwards; for "New_Asset_002", from Jan onwards. This should happen dynamically, based on the form selections.
The code should not have to cycle through all twelve periods for each asset to figure out the asset's start period.

Solution Part:

From different sources, I learned we can reverse-engineer a member name from a Smart List value as below:

@MEMBER(@CONCATENATE("HSP_ID_", @hspnumtostring("AssetStartPeriod"->"BegBalance")))

Okay!!!

Can I use this function in a FIX/ENDFIX to get the list of periods from the asset form? Let's try this:

FIX(@MEMBER(@CONCATENATE("HSP_ID_", @hspnumtostring("AssetStartPeriod"->"BegBalance"))):"Dec")


No, it's not working. Let's try the same using IF/ENDIF.

if (@ISMBR(@MEMBER(@CONCATENATE("HSP_ID_", @hspnumtostring("AssetStartPeriod"->"BegBalance"))) : "Dec"))


Hmmm. This got validated! 
Let me execute it and see if it works as expected.


Results are good. 


Let me modify the start periods and run







 


Looks alright!

How does this work? We should know how the system reads it. Let me try a return function to see what the system reads.

@return(@CONCATENATE("HSP_ID_", @hspnumtostring("AssetStartPeriod"->"BegBalance")),error);


  Cannot calculate. Essbase Error(0): HSP_ID_2155886880967624

  Can this "HSP_ID_2155886880967624" be an alias hidden inside, representing the period "Feb"?




If I supply this string to the @MEMBER function, what does it read?





Here you go – it is Feb.

Note:

It works only if the Smart List is created from members.








Also, see to it that the code always returns a proper value: a #MISSING or invalid numeric ID will not work. Let me comment out the if ("AssetStartPeriod"->"BegBalance" <> #MISSING) condition so that the system also processes assets with missing data.



Here it tried to process "New_Asset_002" as well, and failed because the Smart List lookup returned an invalid period.
Let me revert the changes.


It worked here because the condition if ("AssetStartPeriod"->"BegBalance" <> #MISSING) filtered out "New_Asset_002", so it was not processed.
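Putting it all together, here is a minimal sketch of the whole pattern. It is hypothetical: member names such as "New_Assets", "AssetCost", and "UsefulLife", and the straight-line formula itself, are illustrative assumptions; only the Smart List lookup and the #MISSING guard come from the steps above.

```
/* Hypothetical sketch - dimension and member names are assumptions */
FIX(@RELATIVE("New_Assets", 0))
    "Depreciation" (
        /* Guard: skip assets with no start period selected, so the
           HSP_ID lookup never receives a #MISSING or invalid id */
        IF ("AssetStartPeriod"->"BegBalance" <> #MISSING)
            /* True only from the Smart List start period through Dec */
            IF (@ISMBR(@MEMBER(@CONCATENATE("HSP_ID_",
                @HspNumToString("AssetStartPeriod"->"BegBalance"))) : "Dec"))
                /* Illustrative straight-line monthly depreciation */
                "Depreciation" = ("AssetCost"->"BegBalance"
                                  / "UsefulLife"->"BegBalance") / 12;
            ENDIF
        ENDIF
    )
ENDFIX
```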

Hope this helps. Thanks. 

 

Wednesday, April 19, 2023

Statement of Cash Flows if at all prepared by a Techie

Imagine that you run a retail store. You find your long-term loyal customers are short of money due to a recession and job cuts, so you let them order groceries on credit. You are therefore creating turnover, but no money will be flowing in from this any time soon.

This means that no funds are flowing in for the purchase of new goods, payment of employees' salaries, and utility bills. While this problem does not appear on the income statement (groceries sold on credit are considered income), or only with a substantial delay, it becomes directly visible in the cash flow statement, as the net profit shown on the income statement is adjusted for transactions in which the company has not (yet) received an inflow of money.

How to calculate Statement of Cash Flows

The cash flow statement is the central element of any financial statement analysis. Since the income statement is not adjusted for non-cash items, only the cash flow statement shows the true cash flows to and from the company during the year.

We have two methods:
The direct method: reports gross cash inflows and gross outflows from operating activities, but it is not as widely used as the indirect method because it is more complex to implement.

The indirect method: uses a format that differs from the direct method only in the section where net cash provided or used by operating activities is calculated.

The investing and financing sections of the statement of cash flows are exactly the same under either method.

What do we need?
Information used to prepare this statement is obtained from the income statement for the year and comparative balance sheets for the last 2 years. Net income is adjusted in order to convert the accrual basis income statement to cash flows.

As discussed earlier, the statement which follows is divided into operating, investing, and financing activities. When we use the indirect method to determine cash flows, we start with the net income figure from the income statement and adjust the net income amount to determine the net amount of cash provided or used in operating activities.

Let us use the accounting equation and modify it slightly by splitting cash from assets.

1)    Cash + Non-Cash Assets = Liabilities + Equity

If we solve for Cash, we have:

2)    Cash = Liabilities + Equity − Non-Cash Assets

We use the symbol Δ (delta) to denote "change in" and apply it to the revised accounting equation.

3)     ΔCash = ΔLiabilities + ΔEquity − ΔNon-Cash Assets

If we split now:

Non-Cash Assets = Fixed Assets (FA) + Accounts Receivable (AR) + Inventory (INV)

Δ(FA) = CAPital EXpenditure (CAPEX) − Depreciation (Dep)

Liabilities = Long-Term Debt (LTD) + Short-Term Debt (STD) + Accounts Payable (AP)

D = Debt = LTD + STD

Equity = Share Capital (SC) + Retained Earnings (RE)

Δ Retained Earnings (ΔRE) = Net Income (NI) − Dividends (Div)

This means that by examining the liabilities, equity, and non-cash asset accounts for changes from one period to another we can explain the change in the cash account.

We may rewrite equation 3 as follows:

4)    ΔCash = Δ(LTD + STD + AP) + Δ(SC + RE) − Δ(FA + AR + INV)

5)    ΔCash = ΔD + ΔAP + ΔSC + (NI − Div) − (CAPEX − Dep) − ΔAR − ΔINV

Then let me substitute. WCR = Working Capital Requirements = AR + INV – AP 

6)    ΔCash = ΔD − ΔWCR + ΔSC + NI − Div − CAPEX + Dep

7)    ΔCash = (NI + Dep − ΔWCR) − CAPEX + (ΔSC + ΔD − Div)

Operating Cash Flow = NI + Dep − ΔWCR

Investing Cash Flow = −CAPEX

Financing Cash Flow = ΔSC + ΔD − Div

8) Ending Cash Balance = Opening Cash Balance + ΔCash

Equation 8 is a mathematical derivation of the cash flow statement under the indirect method.
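As a quick sanity check of equations 7 and 8, here is a worked example with assumed figures (all numbers hypothetical): NI = 100, Dep = 20, ΔWCR = 30, CAPEX = 50, ΔSC = 0, ΔD = 10, Div = 15, and an opening cash balance of 200.

```latex
\begin{align*}
\text{Operating CF} &= NI + Dep - \Delta WCR = 100 + 20 - 30 = 90\\
\text{Investing CF} &= -CAPEX = -50\\
\text{Financing CF} &= \Delta SC + \Delta D - Div = 0 + 10 - 15 = -5\\
\Delta Cash &= 90 - 50 - 5 = 35\\
\text{Ending Cash} &= 200 + 35 = 235
\end{align*}
```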

Statement of Cash Flows is a great tool to reconcile the accrual method of accounting to the cash basis of accounting, and also to examine the changes in current assets and current liabilities.

Reference:

[1] Michael Samonas, Financial Forecasting, Analysis, and Modelling: A Framework for Long-Term Forecasting.


Monday, February 24, 2020

Set Commands - Do they really set Essbase on?


SET CACHE HIGH;
SET UPDATECALC ON;
SET LOCKBLOCK HIGH;
SET CALCPARALLEL 4;
SET FRMLBOTTOMUP ON;

I have many rules loaded with SET commands like the above in my EPBCS environment. Do they really set Essbase on?
Let's see them one by one in detail.

a. SET CACHE - First, let's see what the Tech Ref says.

Essbase uses the calculator cache to create and track data blocks during calculation. Using the calculator cache significantly improves your calculation performance. The size of the performance improvement depends on the configuration of your database.

You can choose one of three levels - high | low | default. Again, this is dependent on the CALCCACHE setting in the essbase.cfg file

Will it help performance?
Definitely, if you're required to calculate at least one sparse dimension in full. You will have additional resources in play to speed up the process.

Do I recommend this?
Yes and No. Yes, if you're running admin/batch jobs in serial mode. And, No if it's a planning business rule triggered by multiple users at the same time. In the second case, trying to tap more resources for multi-process can result in adverse performance.

b. SET UPDATECALC - This setting lets us turn intelligent calculation on or off.

ON - Essbase calculates only blocks marked as dirty (see Description). Dirty blocks include updated blocks and their dependent parents (see Notes). The default setting is ON. You can change this default using the UPDATECALC TRUE | FALSE setting in the essbase.cfg file.

OFF - Essbase calculates all data blocks, regardless of whether they have been updated.

ON will help performance, is that right?
Yes, but it calculates only the blocks that are marked dirty, which may lead to data issues.

What do I recommend?
OFF - universally, because data accuracy takes precedence over performance.
That said, if you understand well when blocks are marked dirty/clean, you're eligible to turn intelligent calc on in situations where it can help performance.


c. SET LOCKBLOCK - Specifies the maximum number of blocks that Essbase can get addressability to concurrently when calculating a sparse member formula.

Error: 1012704 Dynamic Calc processor cannot lock more than [200] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).

An error like the above pushes us to bump this setting to the next level, only to make things worse.

What do I recommend?
You can try up to 1000, but not more than that. This error is an indication of a problem with your code and outline structure; I suggest working on that. And I strongly suggest setting the data cache and index cache to a reasonable level to avoid the possible second part of the error - "a small data cache setting could also cause this problem, please check the data cache size setting".


d. SET CALCPARALLEL - Enables parallel calculation in place of the default serial calculation.
Essbase analyzes each pass of a calculation to determine whether the parallel calculation is possible. If it is not, Essbase uses serial calculation even if CALCPARALLEL is set.

Many think this is a mantra to improve performance. It is, if Essbase can create a task schedule that can be executed in parallel. If you can't determine whether parallel calculation is possible, it is of no use and may do harm.

In the old world, we used this in conjunction with SET CALCTASKDIMS to determine the possible parallel task schedule. Now, that is implicit in the parallel command. But still, the key is the task schedule and non-empty tasks. You should prepare a case sheet and do trial and error to come up with the optimum setting. This involves your outline order as well.

What do I recommend?
Use it in admin/batch jobs after detailed study, findings, and results. Stay away from rules that are meant for concurrent business users; in that case it may eat into system resources that are otherwise required for other activities.
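To make the contrast concrete, here is a hedged sketch of how I would head up the two kinds of rules per the recommendations above. The exact values are assumptions to be tuned per environment, not universal advice.

```
/* Admin/batch rule, run serially off-hours - tuned after study */
SET UPDATECALC OFF;   /* data accuracy over intelligent calc */
SET CACHE HIGH;       /* helps when aggregating a sparse dimension in full */
SET CALCPARALLEL 4;   /* only after verifying a real parallel task schedule */

/* User-triggered business rule, possibly concurrent - keep it lean */
SET UPDATECALC OFF;
/* no CALCPARALLEL or CACHE HIGH - avoid contending for shared resources */
```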


e. SET FRMLBOTTOMUP - Optimizes the calculation of complex formulas on sparse dimensions in large database outlines. This command tells Essbase to perform a bottom-up calculation on formulas that would otherwise require a top-down calculation.

"Otherwise" means? You're instructing it to go bottom-up; if not, its intended behavior is to go top-down.

Why and when does Essbase go top-down?
The formula contains complex functions (for example, range functions), or
the formula's dependencies are not straightforward.

What does my SET FRMLBOTTOMUP do then?
It improves performance, but at the cost of data accuracy.
How can I avoid that? Don't use it. Please!

Still, I want to use it in exactly the case for which this command was introduced. Suggest the way?
Create parallel environments, on/off. Simulate the full production cycle. Review the logs and compare the results. If no harm is found, you win; otherwise, a big ouch!

***** One last Q - if it's of no use, why are our pros using it so heavily in their rules?
I am an Indian; let me explain this from the Indian context. We pray to many Gods. Each can grant immunity and strength in one life skill or force. We want all; we don't want to disappoint any. So we pray to all.
In the same way, SET commands help you create the right vehicle for your calc if used well. We want it all, so we use them all. Is that explained? 😊
If you ask, I have seen Americans use them alike too. My answer is: given the chance, they are also like us, praying to many for more benefits. 😉



Thursday, February 6, 2020

Decoding EPM Cloud


What does Decoding mean? Let's see how Techopedia defines it.

Decoding is the process of converting code into plain text or any format that is useful for subsequent processes. Decoding is the reverse of encoding. It converts encoded data communication transmissions and files to their original states.

What am I going to do here now? Am I going to decipher some crypto that Oracle didn't want us to hack into?

Not that; this is not in that direction. This is to discuss in detail the behaviors and features that are implicit to the system and less understood or misunderstood. In the old school of Essbase, we called it "Expected Behavior vs. Original Behavior".


How will this post be organised?

When it comes to the ordering of topics, it's very unorganised. Be that as it may, within each topic, it will be complete, detailed, and fully explained.
This is the one post that lives on and keeps getting updated till my last coding day with EPM. After every 10 Qs/topics, the next part will be started.

Stay tuned to it for the original content.


******************************************************

1. Set Commands - Do they really set Essbase on?










Friday, December 27, 2019

EPM Cloud Design Considerations - Part I - Chart of Accounts


1. What is a chart of accounts (COA)?

The Chart of Accounts is a key tool in capturing your transactional accounting information so you can report in a manner that supports and benefits the organization's various requirements like consolidation, planning, and reporting.
The chart of accounts contains all the accounts used by a business. It defines how and what account titles to use in recording transactions. Account titles may depend upon the line of business, the type of ownership, preferences, and other factors.

2. Explain the structure and representation of a COA.

It is a combination of multiple fields that define the complete transaction details. In Oracle, each field is called a segment. So, multiple segments put together make up your chart of accounts.

The effectiveness of the current business process, and the future one, depends on how you choose the number of segments, their length, names, and order.

COA Structure: Company-CostCenter-Account-SubAccount-Product-Future

Eg: 101-2020-401010-1210-13210-0000

Segment1 is a company/legal entity; mostly, it will be the balancing segment.

Segment3 is a natural account like Sales, Cost, Cash, and Asset.

Segment6 is for the future. It means you need 5 segments now; still, you are adding one extra that may be needed in the future for the effective classification of transactions.


3. I have a group company that has multiple companies and multiple lines of business. To add more to the complexity, we use multiple accounting systems with multiple COAs ranging from bad to okay. If I offer you a project to implement EPM & BI, how do you develop the reporting COA?

I wish I could explain this better with a pictorial representation. I can. I did.




4. This looks to be a complicated process. Is there a way to simplify it?


Yes and No. Reporting COA Implementation is a logical solution. And, it is an upstream requirement created by the EPM & BI implementation. It is the basis for EPM & BI. You can't quit or go away from it.
But, if we take it up as a separate project, a Master Data project, we can simplify the EPM & BI implementation. And, at the same time, you can build a single source of truth across the organization, systems, and processes.
Oracle has a solution for this, called DRM - Data Relationship Management, running on Oracle Database and IIS.


5. After the implementation of Reporting COA, are we ready with all EPM Dimensions?

Mostly yes. Still, we need to make them ready for EPM based on the EPM Design.

6. For an EPM project, can you explain the road ahead after the Reporting COA?

Once your Reporting COA/Master Data is ready from the organisation side, is it enough for EPM? Definitely not. By now, you surely have Metadata - sets and attributes, anything that is meaningful at the organization level and at finer detail. What exactly we need for EPM depends on the below:
  1. Identify the Process and Sub-process
  2. Identify the grain
  3. Identify the dimensions
  4. Identify the facts
  5. Verify the model
7. What do you mean by process and sub-process?

Processes are at a high level - like Financial Planning, Consolidation, and Reporting.
Sub-processes are the blocks that make up a process. In Planning, we call them models. Below are a few:

    Profit and Loss Model
    Balance Sheet Model
    Cash Flow Model
    Workforce Model
    Capex Model
    Feasibility Model

8. What do you mean by grain and how to identify?

Grain/granularity is the lowest level of detail. It is defined per process and mapped to the cube/Plan Type based on the processes that sit in the respective Plan Type.

A Plan Type is technically a vessel that contains one or multiple processes/models.
Your dimensions are relevant to the process, not to the Plan Type. If a Plan Type contains multiple processes, dimensional irrelevance arises.

For e.g., the Profit and Loss Model, Balance Sheet Model, and Cash Flow Model are part of the "Finance" Plan Type. The Product dimension is a must in the Profit and Loss Model to derive product-level profitability, whereas it is not required in the other two.

Thus, the Product dimension is relevant in P&L and not in the others. Still, you should maintain this dimension in the other two as well and associate data with a default member, AKA "All Products" or "No Product".

Let's come back to the "grain". Say the Product dimension detail is as below, from top to bottom:

Product Total -> Product Line -> Product ID -> SKU. Here, SKU is the grain of the product master in the organisation. In EPM, we may define "Product ID" as the leaf. Likewise, the combination of the leaf levels from all dimensions is the grain of the Plan Type. This is the input/load level, similar to the segment string in ERP.

9. How do you identify the facts?

First, let me explain what the Facts mean. Facts are measures in Analytical Models - like
"No. of Deals"
"Value of Deals in $"
"No. of unsuccessful deals"
"Value of Lost Deals in $"
These are generally BI models. The key is to go into as much detail as is available to you.

But in financial models, facts are the accounts used to prepare financial statements - like revenue, expense, asset, liability, etc. Here, in this case, facts are the Lev0 members of the Account dimension in EPM.

Is there a standard to identify this level? Yes, we have techniques, not standards.
It's driven by the requirements - a.k.a. the business determines the solution, not the other way around.
We must consider the below factors when building the P&L hierarchy:
1. Actual entry point in ERP or any Accounting Information Systems.
2. Budget entry point in Planning Solutions - existing and new
3. Reporting Levels - Summary and Detail
4. Face of the P&L and Notes

When it comes to the Balance Sheet and Cash Flow, I consider the face of the financial statements and a place to capture the movements. I consider sub-models with assumption accounts to arrive at a specific element as needed - say, a Fixed Asset block for PPE and Depreciation.

***********************
Ohh. Dad! I have finished the first part...
Am I really done? Yes, for the first part, within the Conceptual and Logical layers. Anything more I write is sure to overstep into the Physical layer.
Is it as beautiful as I wanted it to be? The mind is an amazing thing. I am not an artist who can paint all thoughts beautifully. I am Tech, just a Tech!

I will try to be better in the second part. And, and, what a feeling it is to finish a work on Sunday before the lights are on.
***********************



Saturday, August 4, 2012

Block Creation Issue–How Essbase Thinks

Hi,

I have written many calcs in Essbase for as many different business cases, with volumes up to 100 GB. The most challenging issues I have faced in the Essbase world of calculations, w.r.t. data integrity or performance, are somehow directly or indirectly related to block creation.
So I would like to bring some of my experiences in this regard into this article, on this cloudy Hyderabad Saturday evening.

First, we will see when blocks are created:


Data Load -- data blocks are created when data is loaded to sparse member combinations that did not previously exist. The blocks are created upon input, whether loaded using a source file and load rule or sent from a spreadsheet using lock and send.

Sparse Rollup-- data blocks are created when sparse dimensions are rolled up as specified in the outline using a CALC ALL, CALC DIM or other calculation function.

Datacopy-- datablocks are created when the DATACOPY command is executed in a calc script.

Member Formula -- data blocks may be created under certain circumstances as a result of a member formula, e.g. Bud = Act + Act * 0.1;
This functionality is enabled when the Create Blocks on Equations checkbox is checked in the Calculation Group of the Database Settings dialog, or with SET CREATEBLOCKONEQ ON / SET CREATENONMISSINGBLK ON in the calc script.

--- In the first three cases, blocks are created naturally with no effort. The fourth one is troublesome, and it is what this whole conversation is about.

Why are we worried about this?

 1) The logic seems good and the code seems good, but it creates severe data integrity issues.
 2) Sometimes it is not caught in testing either.
 3) A change implemented in the middle of the year works well till Dec, then suddenly creates issues in Jan as the year changes (in a case where Period is dense and Year is sparse).
 4) It is a complex matter in Essbase to digest and handle, even for consultants with deep experience and understanding of Essbase.
 5) Performance becomes much worse if we apply the known remedies as a safety measure in all possible cases without first identifying that the issue exists.

What Essbase DBAG says:


Constant Values
You can assign a constant value to a member: Florida = 120;
In this formula, Florida is a member in a sparse dimension and 120 is a constant value. Analytic Services automatically creates all possible data blocks for Florida and assigns the value 120 to all data cells. Many thousands of data blocks may be created.

--- This is also clear. No issues with this point either.

Non-Constant Values
If you assign anything other than a constant to a member in a sparse dimension, and no data block exists for that member, new blocks may not be created unless Analytic Services is enabled to create blocks on equations.

For example, to create blocks for West that didn’t exist prior to running the calculation, you need to enable Create Blocks on Equations for this formula:
Florida = “New York” + 120;

You can enable Create Blocks on Equations at the database level whereby blocks are always created, or you can control block creation within calculation scripts.

We will study this in detail.
Sample/Basic (Most Popular and Favorite) is the cube for our study.
We all know what’s dense and what’s sparse in this cube.
Dense: Measures, Year and Scenario | Sparse: Market and Product

I have loaded the sample data calcdat.txt and cleared the blocks for Florida using the below calc.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
Fix(Florida)
ClearBlock All;
Endfix

After that, below is a snapshot of the data, where all the blocks for Florida got cleared.

image

See below that the global database setting “Create Blocks on Equations” is set to Off (this is the default).
image

We will write a calc to calculate the data for Florida from that of New York, without Create Blocks on Equations turned on in the calc.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
Fix(Actual)
Florida="New York"+120;
Endfix

This calc should not work, as per the DBAG, since there are no blocks for Florida.
Run this calc and see the results. Surprisingly, we can see the results as per the calc logic.
image

Is this constant value (120) making magic even in the formula?
Clear the blocks for Florida again.
image

We will write a calc whose formula has two non-constant operands, and explicitly turn off Create Blocks on Equations in the calc.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET CREATEBLOCKONEQ OFF;
Fix(Actual)
Florida="Utah"+"West";
Endfix
Run this calc and see the results. The calc is done again.
image

Is the block creation issue a myth? Shall we conclude there is no block creation issue?

No. No. It is there, and it should be there; otherwise, we don't have any work.
We will clear the blocks again and rewrite the calc with dense member formulas targeting Florida, where blocks don't exist.

//ESS_LOCALE English_UnitedStates.Latin1@Binary
Fix("Actual","Florida")
"Sales"="Utah"->"Sales" + 120;
"COGS"="Utah"->"COGS" + 120;
Endfix
Sales and COGS for Florida are expected to be calculated from those of Utah. Nothing is calculated here.

image

Happy to see that the block creation issue is there. What happened?

By default, Essbase applies dense-member formulas only to existing data blocks. As no blocks exist for Florida, no calculation happened here.
How can this be resolved? By updating the calc with SET CREATENONMISSINGBLK ON.

//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET CREATENONMISSINGBLK ON;
Fix("Actual","Florida")
"Sales"="Utah"->"Sales" + 120;
"COGS"="Utah"->"COGS" + 120;
Endfix
Issue resolved. Calculation is done.
image

What the SET CREATENONMISSINGBLK ON will do?

SET CREATENONMISSINGBLK ON enables Essbase to create potential blocks in memory where the dense-member formulas are performed. Of these potential blocks, Essbase writes to the database only blocks that contain values; blocks resulting in only #MISSING are not written to the database.
In order to create new blocks, setting SET CREATENONMISSINGBLK to ON requires Essbase to anticipate the blocks that will be created. Working with potential blocks can affect calculation performance.
The Create Blocks on Equations setting is ignored when this setting is turned on.

Is this good, or are there any issues with it?

It will create potential blocks in memory, even though it writes and stores only the non-missing blocks after calculation. This is a very resource-intensive algorithm that can hamper the performance of the calc.

The better approach is to rewrite this as a sparse member formula.

//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET CREATEBLOCKONEQ OFF;
SET MSG SUMMARY;
Fix("Actual","Sales", "COGS")
"Florida"="Utah"+ 120;
Endfix
Clear the blocks again and run this calc.
With SET CREATEBLOCKONEQ OFF, this calc should not work. But the results are fine, exactly matching those of the dense formula with SET CREATENONMISSINGBLK ON.

image

As per the Essbase DBAG, with SET CREATEBLOCKONEQ OFF, Essbase will not create a block when a formula assigns a non-constant value to a sparse dimension member for which no block exists.

We will look at the application log:
[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1200481)
Formula for member [Florida] will be executed in [TOPDOWN] mode

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1012668)
Calculating [ Market(Florida)] with fixed members [Measures(Sales, COGS); Scenario(Actual)]

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1012672)
Calculator Information Message:

Maximum Number of Lock Blocks: [100] Blocks
Completion Notice Messages: [Disabled]
Calculations On Updated Blocks Only: [Enabled]
Clear Update Status After Full Calculations: [Enabled]
Calculator Cache: [Disabled]

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1012677)
Calculating in serial

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1012672)
Calculator Information Message:

Total Block Created: [0.0000e+000] Blocks
Sparse Calculations: [1.8000e+001] Writes and [7.2000e+001] Reads
Dense Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Sparse Calculations: [0.0000e+000] Cells
Dense Calculations: [0.0000e+000] Cells

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1012579)
Total Calc Elapsed Time for [CALF0.csc] : [0.144] seconds

[Sat Aug 04 14:53:59 2012]Local/Sample/Basic/admin@Native Directory/8168/Info(1013274)
Calculation executed


See from the log that the formula calculation is happening in top-down mode for Florida.
Essbase uses one of two calculation methods to do a full calculation of a database outline: bottom-up calculation or top-down calculation. By default, Essbase does a bottom-up calculation.
If the database outline contains a complex member formula, Essbase performs a top-down calculation for the relevant member.

We will force the formula bottom-up and see the results again.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET CREATEBLOCKONEQ OFF;
SET FRMLBOTTOMUP ON;
SET MSG SUMMARY;
Fix("Actual","Sales", "COGS")
"Florida"="Utah"+ 120;
Endfix
Clear the blocks for Florida again and run this calc. See: nothing is done. No blocks are created, even for the sparse member formula.

image

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1012668)
Calculating [ Market(Florida)] with fixed members [Measures(Sales, COGS); Scenario(Actual)]

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1012672)
Calculator Information Message:

Maximum Number of Lock Blocks: [100] Blocks
Completion Notice Messages: [Disabled]
Calculations On Updated Blocks Only: [Enabled]
Clear Update Status After Full Calculations: [Enabled]
Calculator Cache: [Disabled]

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1012677)
Calculating in serial

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1012672)
Calculator Information Message:

Total Block Created: [0.0000e+000] Blocks
Sparse Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Dense Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Sparse Calculations: [0.0000e+000] Cells
Dense Calculations: [0.0000e+000] Cells

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1012579)
Total Calc Elapsed Time for [CALF0.csc] : [0] seconds

[Sat Aug 04 15:01:24 2012]Local/Sample/Basic/admin@Native Directory/7628/Info(1013274)
Calculation executed


See the log. No top-down calculation, and no reads or writes.

Turn on Create Blocks on Equations and see the results.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET CREATEBLOCKONEQ ON;
SET FRMLBOTTOMUP ON;
SET MSG SUMMARY;
Fix("Actual","Sales", "COGS")
"Florida"="Utah"+ 120;
Endfix

image

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012668)
Calculating [ Market(Florida)] with fixed members [Measures(Sales, COGS); Scenario(Actual)]

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012672)
Calculator Information Message:

Maximum Number of Lock Blocks: [100] Blocks
Completion Notice Messages: [Disabled]
Calculations On Updated Blocks Only: [Enabled]
Clear Update Status After Full Calculations: [Enabled]
Calculator Cache: [Disabled]

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012672)
Calculator Information Message:

Create Blocks on Equations: [Enabled]
[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012677)
Calculating in serial

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012672)
Calculator Information Message:

Total Block Created: [3.6000e+001] Blocks
Sparse Calculations: [1.8000e+001] Writes and [3.6000e+001] Reads
Dense Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Sparse Calculations: [0.0000e+000] Cells
Dense Calculations: [0.0000e+000] Cells

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1012579)
Total Calc Elapsed Time for [CALF0.csc] : [0.123] seconds

[Sat Aug 04 15:09:16 2012]Local/Sample/Basic/admin@Native Directory/6768/Info(1013274)
Calculation executed


The log shows Create Blocks on Equations: [Enabled]. This is what allowed the calc to create the missing blocks and complete.

What does SET CREATEBLOCKONEQ ON do? What does the Technical Reference say?


If calculations result in a value for a sparse dimension member for which no block exists, Essbase creates a block. Sometimes, new blocks are not desired; for example, when they contain no other values. The Create Blocks on Equation setting is designed for situations when blocks would be created as a result of assigning something other than a constant to a member of a sparse dimension.

What does this actually mean? And which is better: CREATEBLOCKONEQ or CREATENONMISSINGBLK?


CREATEBLOCKONEQ creates blocks only when the right-hand-side expression of a sparse member formula results in a value. If the sparse formula calculation runs in the default top-down mode, blocks are created automatically, even without Create Blocks on Equations: [Enabled].

Create Blocks on Equations should be enabled for the specific section of the calc that contains a sparse member formula whenever the calc globally forces formulas bottom up, or whenever the essbase.cfg setting CALCOPTFRMLBOTTOMUP is set to TRUE to optimize sparse calculations.
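A minimal sketch of scoping the setting to just the sparse-formula section, based on the script above; SET CREATEBLOCKONEQ can be toggled on and off within a single script, so the rest of the calc does not pay the overhead:

```
//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET FRMLBOTTOMUP ON;
SET MSG SUMMARY;

/* enable block creation only around the sparse assignment */
SET CREATEBLOCKONEQ ON;
FIX("Actual", "Sales", "COGS")
    "Florida" = "Utah" + 120;
ENDFIX
SET CREATEBLOCKONEQ OFF;

/* ...remaining calc sections run without the block-creation overhead... */
```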

CREATEBLOCKONEQ / CREATENONMISSINGBLK – both settings carry a performance cost because of the extra processing overhead, but CREATEBLOCKONEQ is the better choice: it creates only the blocks actually required, whereas CREATENONMISSINGBLK evaluates all potential blocks during the calculation and stores only the non-missing ones. CREATENONMISSINGBLK therefore has a much bigger impact on calc performance.
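For comparison, a sketch of the CREATENONMISSINGBLK alternative on the same formula; it should likewise be scoped to the smallest possible section, because every potential block within the FIX is evaluated:

```
SET FRMLBOTTOMUP ON;

/* evaluates all potential blocks, keeps only non-#Missing results */
SET CREATENONMISSINGBLK ON;
FIX("Actual", "Sales", "COGS")
    "Florida" = "Utah" + 120;
ENDFIX
SET CREATENONMISSINGBLK OFF;
```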

Hope I covered as much as I could. Feeling tired, logging out here. Thanks.

Thursday, August 2, 2012

FDM - Interface to load data into Essbase

This article describes the interface development to load data into Essbase using FDM.

Technologies used in this article are
1) ORACLE database 11g
2) Hyperion Foundation Services 11.1.2.2
3) Hyperion Essbase 11.1.2.2
4) Hyperion Essbase Administration Services 11.1.2.2
5) Hyperion FDM 11.1.2.2
6) Essbase Excel Add-in 11.1.2.2

Preparation: the following preparatory work was done before proceeding with the interface development.
1) Created a schema in Oracle Database as required to create a FDM application.
2) Modified the Sample/Basic application into FPLN/PNL with three additional dimensions: Year, Version and Amt_Type. Renamed Measures to Account.


Introduction: it took a couple of hours to come up with a sequence that keeps the interface development on track. After that, everything went smoothly.

The following steps outline the process for building a Hyperion FDM Interface to load data into an Essbase application:
Step 1: Create a new Hyperion FDM application
Step 2: Integrate the new application with the target system
Step 3: Populate the Hyperion FDM control tables
Step 4: Create the Hyperion FDM locations
Step 5: Import maps for each location
Step 6: Build import formats
Step 7: Test data loading
Step 8: Validate


Flow chart: building an FDM interface for data loading.

Step 1: Create a new Hyperion FDM application

The first step is to create a new FDM application.
Log onto Hyperion FDM using the <new application> option.

Enter the following details:
Application Name: ESSINT
Description: Essbase Integration
UNC path: \\WIN-OZCO68Q3T50\FDMData\
App Group: FDM

Enter the Database details:
OLEDB Provider: ORAOLEDB.ORACLE
Service: HYP001
Username: FDMESS
Password: xxxxxxxx


Step 2: Integrate the new application with the target system


Open the Workbench and log in to the newly created application, ESSINT.

Import the Essbase Adaptor



Browse to find the file ES11X-G4-J.xml and click Open.

Register Adaptor

Browse to find the file named upsES11XG4J.exe and click Open.

Right-click the adaptor and select Configure.
In the Machine Profiles tab, enter the source and target machine details.
Here it is: WIN-OZCO68Q3T50
In the Hyperion Essbase Integration Tab - Enter as below
Essbase Application Name: FPLN
Essbase DB Name: PNL
Logon Method: Select 2-Global
Username: hypadmin
Password:xxxxxxxx

In the Dimensions Tab - Make the required dimensions active.

Step 3: Populate the Hyperion FDM control tables:

Control tables are used to supply the values displayed in system options. These tables allow the system administrator to control the system options and POV values that users can select.
Open the Metadata->Control Tables.
Add Periods. This will supply the Period and Year to the Target Essbase database.
The Period control table allows for the definition of the valid fiscal periods available to Hyperion FDM. The values in this table provide the list of possible periods that can be selected from the Period POV link. Changing the Hyperion FDM period will change where data is loaded in the target system.


Add Categories
The Categories Table contains the definition of data categories. These categories represent “buckets” (Scenarios) in which data can be loaded.


The values in this table provide the list of possible categories that can be selected from the Category POV link. Each Hyperion FDM category is associated with a target system’s scenario. Changing the Hyperion FDM category will change where data is loaded in the target system.

Verify the Currency Codes
The Foreign Currency Table contains a list of currency codes. Currency codes are assigned to locations. The currency code will be displayed on reports. This code does not impact any calculations, and is used for notation purposes only.


Step 4: Create the Hyperion FDM locations

Hyperion FDM Location Types
Hyperion FDM maintains two types of locations: Data Load and Controls Review.
Data Load — a location that accepts data from a source system and loads it, via mapping rules, into a target system. A data load location can also be assigned financial controls. Data load locations can only be children in the controls structure; they cannot be parents.
Controls Review — location assigned only to financial controls. No data is loaded from a controls review location. Controls review locations are typically parents in the controls structure. A controls review location can also be a child of another controls review location.

Data loading locations have maps and are assigned an import format. Optionally, each data loading location may be assigned a logic group and validation rules.

Step 5: Build/ Import maps
The purpose of a mapping table is to map the source dimension members to their corresponding target system dimension members. A map can be loaded for each dimension in the target application defined in Hyperion FDM.
Verify the mappable dimensions under Metadata->Dimensions.
Here dimensions enabled for mapping are - Account, Region, Product, Amt_Type and Version.


Define the Maps from Activities->Maps
Set the Source to Target Mapping for Accounts as below.

Set the Source to Target Mapping for Entity(Region) as below.

Set the Source to Target Mapping for Product as below.

Step 6: Build import formats

Each data load location must be assigned an import group that will be used when importing data files. Import groups instruct Hyperion FDM how to interpret data files and transform them into data that can be loaded into the Hyperion FDM data structure.
The top grid shows import groups, and the bottom grid is where the fields for a group are defined.
By creating an import format, Hyperion FDM is able to interpret the file layout. Once the import format is created, it is important that the source file layout be unchanged each time you load data. If the file format of the source file changes, the import format will need to be updated.


Assign the import format to the location (Sample, in this case).



Step 7: Data loading

With the mapping tables loaded and the import formats created, Hyperion FDM is ready to import the month balance file and load it into the target system. The following steps will be repeated every month by data loaders to load the source data into the target system.

Import source file

Validate

Validation failed because no maps were built for Amt_Type and Version. I had initially assumed that no mapping was required when the source and target have the same value, but mapping details are a must for all mappable dimensions.

Build a Like mapping for the Amt_Type dimension: * to Initial.
(There are four mapping types for each dimension: Explicit, Between, In, and Like.)


Build a Like mapping for the Version dimension: * to Initial.


Go to Workflow and validate again. This time it succeeds; see the gold fish.


Click on Export

Choose "0-Replace"


Export succeeds; see the gold fish.

Validate using Essbase Excel Add-In.

Validation is Successful.

Special Notes:

1) Not convinced by the provided mapping mechanism for the case where the target is the same as the source; that should be simple, with something like an "equals" option to choose. Possibly I just haven't found the simple way to do it.
2) Not satisfied with the data load performance compared to native Essbase tools such as EAS/MaxL.
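For a sense of the native alternative mentioned above, the same monthly load could be scripted in MaxL. This is a minimal sketch only: the data file, rules file, and error file names are assumptions, and a load rule matching the source layout would have to exist on the server.

```
/* sketch: native Essbase load via MaxL; file and rule names are hypothetical */
login hypadmin identified by 'xxxxxxxx' on WIN-OZCO68Q3T50;

import database FPLN.PNL data
    from local data_file 'monthbal.txt'
    using server rules_file 'PNLLOAD'
    on error write to 'monthbal.err';

logout;
exit;
```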
3) Relatively easy once we are with the correct sequence/ process flow.
4) Essbase integration with the Essbase adaptor works much like HFM data integration. It could have been simpler, given the ways Essbase can natively receive data.

Will see how we can tap the abilities of FDM in a better way some time later. Thanks.