Hi all,
Hope all are doing good.
I want to unlock a process template in BPF; it was previously locked by someone else.
Version: BPC 10 SP17
Can anyone help me?
Thanks,
kiran.
Hi All,
How do we handle fixed assets with historical values in local currency for consolidation?
Do the historical values for assets come in the Trial Balance (TB) of the current period?
Could you please outline the steps in this process?
Thanks,
Raja
Hi All,
We are implementing legal consolidation 10.1 NW. I have a scenario: we have 10 entities with multiple rate types
such as AVG, END and HIST.
Out of the 10 entities, 5 entities have rate type HIST for one GL code, and the other 5 entities have rate type END.
My question is: how can I do the translation for this?
Example:
                 Entity 1-5    Entity 6-10
                 Rate Type     Rate Type
Packaged Goods   HIST          END
In the above case, how should the currency conversion be handled? Please advise me.
Regards,
Raja
SAP EPM 10 - NW
How can I generate a report by a dimension property without having that dimension on the row axis?
In the example report shown in the attached mockup, I need a report grouped by product class (with zero suppression), summarized over all products.
Here you can find the results of an investigation I did on the SAP BPC write-back parameters that can be maintained as model parameters for Planning and Consolidation in the SPRO transaction (a general description can be found in the BPC admin guide).
RECLEVEL_NR specifies the maximum number of records to be written for which record-based locking is always applied.
This is quite important, because a wrong setting of this parameter can affect performance and overload the enqueue server by generating a huge number of locking selection ranges.
If the number of submitted records in a BPC input form, or in the result of a Data Manager package, exceeds the value of RECLEVEL_NR, we get 2 different behaviors:
The choice between these 2 scenarios should in theory depend on the parameter SPARSITY_COEF, but in the implementation I checked, the sparse behavior was never triggered, no matter what the value of SPARSITY_COEF is.
This screenshot is taken from BPC release 801, SP level 12. As a result of these 2 scenarios we can have:
NON-SPARSE records
The locking selection is created using BPC parameter INTERVAL_NR. From BPC admin guide we know that:
"In the situation where record level locking is not being implemented and the data set being saved is NOT sparse, any dimensions with less than this number of distinct member values in the dataset will be locked using their single values. If the dimension has more than this number of records, the range between the low to high values will be locked."
So if the number of distinct members of a dimension is less than or equal to INTERVAL_NR, record-based locking is applied for that dimension; otherwise the locking uses an interval spanning only the lowest and highest member.
E.g.
If I write 3 members for Time (2016.01, 2016.02 and 2016.12) and INTERVAL_NR is 2, the interval will lock every Time member from 2016.01 to 2016.12.
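The interval rule above can be sketched in Python (a simplified model of the lock-selection logic for one dimension, not SAP's actual implementation; the function name and tuple format are invented for illustration):

```python
def lock_selection(members, interval_nr):
    """Return the lock selection for one dimension's submitted members.

    Simplified model: at most INTERVAL_NR distinct members -> one
    single-value lock per member; otherwise one BETWEEN range lock
    from the lowest to the highest member.
    """
    distinct = sorted(set(members))
    if len(distinct) <= interval_nr:
        # Record-based locking: one single-value lock per member.
        return [("EQ", m) for m in distinct]
    # Interval locking: a single range covering low..high.
    return [("BT", distinct[0], distinct[-1])]

# The Time example from the text: 3 members, INTERVAL_NR = 2
print(lock_selection(["2016.01", "2016.02", "2016.12"], interval_nr=2))
# -> a single range lock covering every Time member from 2016.01 to 2016.12
```

Note how the range lock also covers members (2016.03 .. 2016.11) that were never submitted, which is exactly why a badly tuned INTERVAL_NR can block parallel writers.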
SPARSE records (in my current version this branch is actually never taken).
The locking selection is created using BPC parameter MULTIPLY_COEF.
"In the situation where record-level locking is not being implemented and a sparse data set is being saved, this value specifies the maximum number of members for which you can implement record level locking (that is, when to swap to using a BETWEEN range in the lock table)."
In this case it will apply the range selection by considering the number of members in every dimension.
E.g.
See the example below, where the model has 4 dimensions and MULTIPLY_COEF is equal to 15. Please note that the table is sorted by the number of members in each dimension: CATEGORY (1 member) comes first and ACCOUNT (7 members) comes last.
Conclusion
I would advise keeping the parameter RECLEVEL_NR at its default value of 10 and not increasing it.
The type of locking implemented will then depend on the parameter INTERVAL_NR (I assume that we are always in a NON-SPARSE scenario).
Impact on parallelisation
When a parallel BPC process (RUNLOGIC_PH or the BPC parallel framework) is implemented, it is highly recommended to use a low value for RECLEVEL_NR.
With RUNLOGIC_PH, a RECLEVEL_NR equal to PACKAGE_SIZE usually causes some processes to fail because the enqueue server gets overloaded.
Below you can see the results of some tests I ran on our BPC system:
RECLEVEL_NR = 40000 and INTERVAL_NR = 10 -> some processes failed. It took ~14 minutes.
RECLEVEL_NR = 10 and INTERVAL_NR = 10 -> Succeeded. It took ~8 min 30s
RECLEVEL_NR = 10 and INTERVAL_NR = 1000 -> Succeeded. It took ~8 min 40s.
Hello experts.
For a very standard testing need, I had to pick some data from the PROD environment and import it into the QUAL environment.
I ran an export on a small scope of data; the resulting file is around 80 MB once downloaded to my workstation.
When I upload the file to the QUAL server (Data Manager, upload data), I get the confirmation message that the file was copied successfully.
After running the import package (successfully), the global amount of data differs between QUAL and PROD (of course I cleared QUAL beforehand).
A bit of investigation shows that the file uploaded to the QUAL server is incomplete. When I re-download it from QUAL and compare it to the initial one, the size is different: the initial file is twice as big.
I finally split my 80 MB file into 4 pieces, and each piece was uploaded and imported with no issue on preprod.
So it appears that size matters after all.
I checked the parameter ALLOW_FILE_SIZE in SPRO; the initial value was 104,857,600 (around 100 MB, I think). It should have been enough, but even 204,857,600 changes nothing.
My test data is in place, so the immediate issue is solved, but I would also like to understand what happened. Do you have any ideas about this strange behaviour? Is there another parameter somewhere that may explain why the system refuses "big" files?
System :
CPMBPC | 801 | 0008 | SAPK-80108INCPMBPC | CPM Business Planning and Consolidation |
EPM addin : 10.0 SP25 .NET 3.5
Thanks in advance for your advices.
G.
Hi all,
Can you please advise how to modify the MOVE Data Manager package so that it adds (instead of overwrites) the values from the source to the target and then deletes the source records?
Thanks
David
Hello experts,
I want to import my planning shell from the DEV system to PROD with the same technical IDs for the BW objects.
I tried to restore the shell using UJBR but failed with the error:
Restoring environment PLANNINGSHELL1 with original technical name not possible.
Should I use a BW transport ?
Thanks
Maha
Dear Colleagues,
This is my first time with BPC Embedded, and we have some problems linking an Analysis for Office workbook to my planning model.
I found a link explaining this; its last point describes how to link an Analysis for Office workbook to a planning model. The first problem is that the guide says I have to change the registry, but when I run REGEDIT and look for the key HKEY_CURRENT_USER\Software\SAP\AdvancedAnalysis\Settings\Planning, this entry does not exist.
Then I open my workbook, go to the Analysis tab, click the Display Design Panel button, and on the design panel go to Components and the Planning tab. But when I look for my planning model, I don't find any model.
Could you help me?
Regards
Hi experts,
I have a somewhat strange requirement: I have been asked to remove the incoming data for one ACCOUNT prior to 2016.FEB in a report. In other words, they'd like to see YTD data for whichever period without any data from 2016.JAN backwards, only from 2016.FEB onwards, and only for that particular account. How can I achieve this? Any help will be greatly appreciated.
Hi all.
I have created an export from BPC NW (10.1) to CSV.
However, the issue I am having is that when we import from ECC, we receive some columns with blank entries.
These are converted during import to PC_NONE.
Any valid entries are prefixed with "PC" before being imported into BPC.
As the export I am writing needs to be in ECC format, I need to output either blanks or the BPC entry minus the "PC" prefix.
This gives me a problem: when I run the export, it fails, telling us that BPC cannot export blank entries.
I thought of sending a {space}, but this will cause issues, as {space} is not {blank}.
I have tried several methods:
Conversion file:
js: %external%=="PC_NONE" ? "" : %external%.replace(/^PC/,"")
js: %external%.toString()=="PC_NONE" ? "" : %external%.replace(/^PC/,"")
js: %external%=="PC_NONE" ? %external%.replace("PC_NONE","") : %external%.replace(/^PC/,"")
I have even tried a 2-part conversion:
TRANSFORMATION:
PROFITCENTRE=*IF(PROFITCENTRE=*STR(PC_NONE) then *STR(); PROFITCENTRE)
Plus
CONVERSION:
js: %external%.replace(/^PC/,"")
As I said earlier though, when I do manage to get it to generate a blank entry, BPC throws an error, as it refuses to export a blank.
Has anyone ever come across and worked around this?
PS - A colleague has suggested exporting everything and then running an exit-BADI to clear the PC_NONE entries, which is an option, but I would like to have everything done in transformation and conversion if possible.
Thanks
Craig
Hi,
During a restore with UJBR I get "Security Data Load ended in error". I found OSS Note 1927908 to get around the problem, but this only excludes the security data from being backed up. How would I go about restoring the security data?
Thanks for any help,
Dear experts,
After triggering any DM package, we observe a 10-minute delay before it responds; after 10 minutes it actually starts running. Do you know of any configuration parameter that needs to be adjusted?
I have already cleared all DM logs and data files.
Thanks in advance
Suvendu
Hi Experts,
For BPC 10.1, what is the reporting and web add-in for iPad? Up to BPC 10.0, EPM Unwired was the option for iPad.
Please guide me: for SAP BPC 10.1, what is the add-in for iPad, or has it moved to a Fiori app?
Thanks in Advance
Ramakrishna
Hello Experts,
I am back again with some more questions and would like your expert opinions on my queries.
In my model I have made use of the Time dimension, which is basically a hierarchy:
Time -
FISCTM1
April
May
June
July
August
September
FISCTM2
October
November
December
January
February
March
Now, for selecting specific dimension members in the report, I have made use of the context options, and I am also using the page axis with the Dimension Override functionality.
With Dimension Override referencing the page axis cell, I have tried accessing more than one time period (April, May and June); unfortunately, I am not able to see the data for all three periods. But when I select just one period, April, I can see the data for Actual, Plan and Forecast accordingly.
Please find attached screenshots -
1. This displays the Cost Center (also a hierarchy) accessed from the page axis, the Category (Actual, Plan and Forecast) and the Time Period.
Time Period - April (selected from the page axis): April.jpg
2. Next, I selected multiple time periods, for which the sheet refreshes as per the attached screenshot: Multisel.jpg
As seen in the screenshots, when multiple periods are selected, the data is not displayed; the time period falls back to the default page axis value, FISCTM1.2015.
Kindly advise how to get the data for the members selected on the page axis in a report where dimension members are overridden using the page axis cell as a reference.
Also, I would like to know whether it is possible, given the Time hierarchy shown above, that when the page axis selection is one of the time periods of a fiscal term, all the earlier periods of that term are displayed automatically in the report as well.
Example: if June is selected for FISCTM1, then April, May and June should be displayed; likewise, if August is selected, then April, May, June, July and August should be displayed.
Looking forward to your favorable responses.
Regards,
Sachin
Hi experts,
I have experience differences in results comparation in reports running same script logic in BPC 7.5 & BPC 10,1
We migrated BPC 7.5 (CPMBPC 750 SP0015) to BPC 10.1: (CPMBPC 810 SP 0009)
Logic script engine enabled is Java script .
The script logic has as objetive to calculate Price. formula is clear: total sales / quantity
V02A (price) = V03 (total sales) / V01 (Quantity)
The following is the description of the model and Logic Script
The VENTAS model has 9 dimensions:
1. MONEDA (Currency type)
2. CONCEPTO (Account type)
3. CLIENTE (User type)
4. COST_CENTER (User type)
5. PRODUCTO (User type)
6. FUENTE (DU member)
7. VERSION (Category type)
8. SOCIEDAD (Entity type)
9. TIEMPO (Time type)
The script logic runs on BPC 7.5 by launching a DM package.
All account members have the ACCTYPE property INC.
The following script has run many times over about 4 years and works:
*LOOKUP VENTAS
*DIM CANTIDAD:CONCEPTO="V01"
*DIM MONEDA="LC"
*DIM MEASURES="PERIODIC"
*ENDLOOKUP
*XDIM_MEMBERSET CLIENTE=100000065_1000
*XDIM_MEMBERSET PRODUCTO=6000075
*XDIM_MEMBERSET SOCIEDAD=1000
*XDIM_MEMBERSET FUENTE=DU
*XDIM_MEMBERSET MONEDA=LC
*XDIM_MEMBERSET COST_CENTER=CO_1000
*XDIM_MEMBERSET CONCEPTO=V03
*XDIM_MEMBERSET VERSION=B00_PLAN
*XDIM_MEMBERSET TIEMPO=2012.ENE
*WHEN CONCEPTO
*IS V03
*REC(EXPRESSION=(%VALUE%*-1)/LOOKUP(CANTIDAD),CONCEPTO=V02A)
*ENDWHEN
This is the source record:
divided by:
After running the script through DM in 7.5, the following values are created (from the BW perspective).
Running the same script logic on the same source data through DM in 10.1, the following value is created in BW.
Note that the absolute value is OK in both, but the sign is different in 10.1.
The master data keeps the same ACCTYPE attribute value.
The problem was solved by changing the sign in the *REC statement:
*REC(EXPRESSION=(%VALUE%*-1)/LOOKUP(CANTIDAD),CONCEPTO=V02A)
REPLACED BY
*REC(EXPRESSION=(%VALUE%)/LOOKUP(CANTIDAD),CONCEPTO=V02A)
Is this a known BPC issue?
Thanks for your help
Jairo
In this article I decided to collect some knowledge regarding default.lgf scripts.
Purpose of default.lgf:
To perform calculations triggered by user data sent via an input schedule or a journal. It can also be launched (if the user selects the option) at the end of some standard DM chains (Copy, Move, Import, etc.).
For DM chains like DEFAULT_FORMULAS that are used to run scripts, default.lgf is NOT triggered.
Scope of default.lgf
When launched, default.lgf receives as its scope the combination of all members of all dimensions of the data sent by the user and actually saved to the cube. If some records are rejected by the write-back or validation BADI, the scope of default.lgf will not contain the rejected member combinations.
Example:
Dimension: DIM1
Members: D1M1, D1M2
Dimension: DIM2
Members: D2M1, D2M2
Input form (all intersections in the cube have value 1):
The user decided to change the values in the cells marked yellow to 2:
2 records, (D1M1,D2M1) and (D1M2,D2M2), will be sent to the cube.
As a result, the scope will be the combination of the following members: D1M1, D2M1, D1M2, D2M2,
generating 4 possible combinations:
sent by the user: (D1M1,D2M1) and (D1M2,D2M2); plus the extra ones: (D1M2,D2M1) and (D1M1,D2M2).
All 4 records will be processed by default.lgf.
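The scope expansion just described can be illustrated with a small Python sketch (a simplified model of the behavior, not BPC code; the function name is invented): the scope is the cartesian product of the distinct members per dimension found in the saved records.

```python
from itertools import product

def default_lgf_scope(saved_records):
    """Scope = cartesian product of distinct members per dimension.

    saved_records: list of dicts, one dict per record sent by the user
    and actually saved to the cube.
    """
    dims = sorted({d for rec in saved_records for d in rec})
    members = {d: sorted({rec[d] for rec in saved_records}) for d in dims}
    return [dict(zip(dims, combo))
            for combo in product(*(members[d] for d in dims))]

sent = [{"DIM1": "D1M1", "DIM2": "D2M1"},
        {"DIM1": "D1M2", "DIM2": "D2M2"}]
scope = default_lgf_scope(sent)
print(len(scope))  # 4 combinations, although only 2 records were sent
```

The two extra combinations, (D1M2,D2M1) and (D1M1,D2M2), appear in the result exactly as in the example above.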
If the default.lgf is like:
*WHEN ACCOUNT //or any dimension
*IS * //any member
*REC(EXPRESSION=%VALUE%+1)
*ENDWHEN
The result will be:
This means that some extra member combinations will be processed by default.lgf, not only the changed data.
General rules:
1. Don't use *XDIM_MEMBERSET/*XDIM_ADDMEMBERSET in default.lgf; do not redefine the scope. The original scope (not huge, by the way) has to be processed.
2. Use *IS criteria in the *WHEN/*ENDWHEN loop to select members for particular calculations.
Sample:
For DM package script the code is like:
*XDIM_MEMBERSET SOMEDIM=%SOMEDIM_SET% // member from user prompt - MEMBER1 or some fixed member
*WHEN SOMEDIM
*IS * // scoped in *XDIM_MEMBERSET
*REC(...)
*ENDWHEN
For default.lgf the code will be:
*WHEN SOMEDIM
*IS MEMBER1 // fixed member - condition to perform calculations in REC
*REC(...)
*ENDWHEN
3. *XDIM_FILTER can sometimes be used to narrow the scope, but its benefit compared to *IS is not clear.
Example:
ACCOUNT dimension contains 3 members: ACC1,ACC2,ACC3
*XDIM_FILTER ACCOUNT = [ACCOUNT].properties("ID") = "ACC1"
// The incoming scope will be filtered to ACC1 if present
*WHEN ACCOUNT
*IS *
*REC(EXPRESSION=%VALUE%+1) // +1 for ACC1
*ENDWHEN
*XDIM_MEMBERSET ACCOUNT=%ACCOUNT_SET%
// Filter is reset, %ACCOUNT_SET% contains original scope
*WHEN ACCOUNT
*IS *
*REC(EXPRESSION=%VALUE%+2) // +2 for ACC1,ACC2,ACC3
*ENDWHEN
*XDIM_FILTER ACCOUNT = [ACCOUNT].properties("ID") = "ACC2"
// The incoming scope will be filtered to ACC2 if present
*WHEN ACCOUNT
*IS *
*REC(EXPRESSION=%VALUE%+3) //+3 for ACC2
*ENDWHEN
The user sends 1 for all 3 accounts (ACC1, ACC2, ACC3). The result is:
ACC1: 4
ACC2: 6
ACC3: 3
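The three passes above can be traced with a short Python sketch (a simulation of the scoping rules under the stated assumptions, not the script engine itself; the helper name is invented):

```python
def run_passes(values):
    """Simulate the three *WHEN/*ENDWHEN passes from the script above.

    values: dict account -> current value; the user sent 1 per account.
    """
    full_scope = set(values)            # ACC1, ACC2, ACC3
    # Pass 1: *XDIM_FILTER narrows the incoming scope to ACC1 -> +1
    for acc in full_scope & {"ACC1"}:
        values[acc] += 1
    # Pass 2: *XDIM_MEMBERSET %ACCOUNT_SET% restores the full scope -> +2
    for acc in full_scope:
        values[acc] += 2
    # Pass 3: *XDIM_FILTER narrows the scope to ACC2 -> +3
    for acc in full_scope & {"ACC2"}:
        values[acc] += 3
    return values

print(run_passes({"ACC1": 1, "ACC2": 1, "ACC3": 1}))
# -> {'ACC1': 4, 'ACC2': 6, 'ACC3': 3}
```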
You also have to be on a recent SP level for *XDIM_FILTER to work correctly (read the notes; search for "XDIM_FILTER").
If you have to calculate some function, like:
Result = Func([SomeDim].[Member1],[SomeDim].[Member2],...,[SomeDim].[MemberN]) (N members in total)
and store the result in some member, then you have to write N *WHEN/*ENDWHEN loops to prevent aggregation when more than 1 member is in scope. Without multiple loops, the result will be multiplied by M, where M is the number of different members sent by the input form simultaneously.
Example (multiply 3 members):
*WHEN SomeDim
*IS Member1
*REC(EXPRESSION=%VALUE%*[SomeDim].[Member2]*[SomeDim].[Member3],SomeDim=ResultMember)
*ENDWHEN
*WHEN SomeDim
*IS Member2
*REC(EXPRESSION=%VALUE%*[SomeDim].[Member1]*[SomeDim].[Member3],SomeDim=ResultMember)
*ENDWHEN
*WHEN SomeDim
*IS Member3
*REC(EXPRESSION=%VALUE%*[SomeDim].[Member1]*[SomeDim].[Member2],SomeDim=ResultMember)
*ENDWHEN
In this example the *REC line can be the same for all 3 loops (%VALUE% can be replaced by a direct member reference):
*REC(EXPRESSION=[SomeDim].[Member1]*[SomeDim].[Member2]*[SomeDim].[Member3],SomeDim=ResultMember)
with minimal performance decrease.
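Why the separate loops matter can be illustrated with a Python sketch. It assumes simplified write semantics (records generated within one *WHEN/*ENDWHEN loop for the same target intersection are summed, while each loop's final result overwrites the intersection); the helper is invented for illustration, not BPC code:

```python
def run_loops(loops, scope, values):
    """Simulate the assumed *WHEN/*ENDWHEN write semantics:
    - within one loop, records generated for the same target SUM;
    - each loop's aggregated result OVERWRITES the target when it writes.
    """
    target = 0.0
    prod = values["Member1"] * values["Member2"] * values["Member3"]
    for matching in loops:
        writes = [prod for m in scope if m in matching]
        if writes:
            target = sum(writes)  # this loop's aggregated result
    return target

vals = {"Member1": 2.0, "Member2": 3.0, "Member3": 4.0}
scope = ["Member1", "Member2", "Member3"]  # user sent all 3 at once

# One loop matching all 3 members: 3 writes sum to 3 * 24 = 72 (wrong, M-fold)
print(run_loops([{"Member1", "Member2", "Member3"}], scope, vals))  # 72.0
# Three loops, one member each: every loop writes the product exactly once
print(run_loops([{"Member1"}, {"Member2"}, {"Member3"}], scope, vals))  # 24.0
```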
Using LOOKUP into the same model to get an expression argument member
In some cases, for a simple formula like the multiplication of 2 members (price * qty) but with a long list of members, LOOKUP can be used.
Let's assume we have the following members in dimension SomeDim:
Price1, Price2, Price3, Price4
Qty1, Qty2, Qty3, Qty4
The results have to be written to:
Amount1, Amount2, Amount3, Amount4
Then we can add the properties MULT, RESULT and TYPE to dimension SomeDim and fill them:
ID MULT RESULT TYPE
Price1 Qty1 Amount1 Price
Price2 Qty2 Amount2 Price
Price3 Qty3 Amount3 Price
Price4 Qty4 Amount4 Price
Qty1 Price1 Amount1 Qty
Qty2 Price2 Amount2 Qty
Qty3 Price3 Amount3 Qty
Qty4 Price4 Amount4 Qty
Code will be:
*LOOKUP SameModel
*DIM M:SomeDim=SomeDim.MULT //Get member ID stored in property MULT
*DIM MEASURES=PERIODIC //The default storage type of SameModel
*ENDLOOKUP
*XDIM_MEMBERSET MEASURES=PERIODIC //The default storage type of SameModel
*FOR %T%=Price,Qty //Or 2 loops - to prevent aggregation.
*WHEN SomeDim.TYPE
*IS %T%
*REC(EXPRESSION=%VALUE%*LOOKUP(M),SomeDim=SomeDim.RESULT)
*ENDWHEN
*NEXT
The lines *DIM MEASURES=PERIODIC and *XDIM_MEMBERSET MEASURES=PERIODIC are required in default.lgf (they are not required in a DM package script)!
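The property-driven technique above can be mirrored in Python (hypothetical data and simplified semantics; the `props`/`run` names are invented for illustration): each scoped member is multiplied by the member named in its MULT property (the LOOKUP) and written to the member named in its RESULT property, with one pass per TYPE as in the *FOR loop.

```python
# Property table for SomeDim (from the text): MULT, RESULT, TYPE per member
props = {}
for i in range(1, 5):
    props[f"Price{i}"] = {"MULT": f"Qty{i}", "RESULT": f"Amount{i}", "TYPE": "Price"}
    props[f"Qty{i}"] = {"MULT": f"Price{i}", "RESULT": f"Amount{i}", "TYPE": "Qty"}

def run(cube, scope):
    """For each scoped member, multiply its value by the member named in
    MULT (the LOOKUP) and write to the member named in RESULT.
    One pass per TYPE, mirroring the *FOR %T%=Price,Qty loop."""
    for t in ("Price", "Qty"):
        for m in scope:
            if props[m]["TYPE"] == t:
                cube[props[m]["RESULT"]] = cube[m] * cube[props[m]["MULT"]]
    return cube

cube = {"Price1": 2.0, "Qty1": 10.0, "Price2": 3.0, "Qty2": 5.0}
run(cube, ["Price1", "Qty2"])  # user changed Price1 and Qty2
print(cube["Amount1"], cube["Amount2"])  # 20.0 15.0
```

Whichever side of the pair the user changes (price or quantity), the same amount member is recalculated, which is the point of storing the partner member in a property.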
*FOR/NEXT Loops
In general, long and nested *FOR/*NEXT loops have to be avoided due to their terrible performance. In most cases, instead of *FOR/*NEXT loops, a property can be created and used in the script code.
Using some value stored as property in calculations
Sometimes it looks like a good idea to store a value in a property and use it in calculations. Actually it's a bad idea: you can't directly reference a property value in an expression; you have to use some %VAR% and a long *FOR/*NEXT loop. Always store values in SIGNEDDATA, possibly using dummy members.
SIGN and ACCTYPE in EXPRESSION calculations
Calculations in default.lgf use a different sign-conversion logic with ACCTYPE than a script run by a DM package. As a result, the same script can produce different results as default.lgf and as a DM package script.
For default.lgf (BPC NW 10 and BPC NW 7.5), all values read into the script scope are sign-converted based on the ACCTYPE property, and the result of the EXPRESSION calculation is also sign-converted based on the ACCTYPE property of the target account:
SignedData_Result = if(Result.ACCTYPE in (INC,LEQ), -1, 1) * Function(if(Argument1.ACCTYPE in (INC,LEQ), -1, 1) * SignedData_Argument1, if(Argument2.ACCTYPE in (INC,LEQ), -1, 1) * SignedData_Argument2, ...)
Example:
Dimension ACCOUNT: Members: A, B, C
ID ACCTYPE
A INC
B EXP
C INC
default.lgf
*WHEN ACCOUNT
*IS A
*REC(EXPRESSION=%VALUE%+[ACCOUNT].[B],ACCOUNT=C)
*ENDWHEN
The data sent by user in the input form will be:
A: 5
B: 10
This data will be stored as SIGNEDDATA:
A: -5
B: 10
Calculations:
(-1 * -5 + 1 * 10) * (-1) = -15 (SignedData_Result)
And on the input form:
C: 15
The same script launched by a DM package (BPC NW 10) has no sign conversions at all; all calculations are done with raw SIGNEDDATA values:
-5 + 10 = 5 (SignedData_Result)
The result on the report:
C: -5
For BPC NW 7.5, the sign logic in a DM package is different! The formula is:
SignedData_Result = Function(if(Argument1.ACCTYPE in (INC,LEQ), -1, 1) * SignedData_Argument1, if(Argument2.ACCTYPE in (INC,LEQ), -1, 1) * SignedData_Argument2, ...)
-1 * -5 + 1 * 10 = 15 (SignedData_Result)
The result on the report:
C: -15
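The three sign-conversion rules can be condensed into one Python sketch that reproduces the worked numbers above (a simplified model under the stated assumptions; function names are invented for illustration):

```python
def sign(acctype):
    # INC and LEQ accounts are stored with the sign flipped in SIGNEDDATA
    return -1 if acctype in ("INC", "LEQ") else 1

def rec_a_plus_b(signed, acctype, mode):
    """C = A + B under the three sign-conversion behaviors from the text."""
    a, b = signed["A"], signed["B"]
    if mode == "default.lgf":      # BPC NW 10 and 7.5: args and result converted
        res = sign(acctype["A"]) * a + sign(acctype["B"]) * b
        return sign(acctype["C"]) * res
    if mode == "dm_bpc10":         # DM package, BPC NW 10: raw SIGNEDDATA
        return a + b
    if mode == "dm_bpc75":         # DM package, BPC NW 7.5: only args converted
        return sign(acctype["A"]) * a + sign(acctype["B"]) * b
    raise ValueError(mode)

acctype = {"A": "INC", "B": "EXP", "C": "INC"}
signed = {"A": -5, "B": 10}        # user entered A: 5, B: 10
print(rec_a_plus_b(signed, acctype, "default.lgf"))  # -15 (shows as 15)
print(rec_a_plus_b(signed, acctype, "dm_bpc10"))     # 5   (shows as -5)
print(rec_a_plus_b(signed, acctype, "dm_bpc75"))     # 15  (shows as -15)
```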
*DESTINATION_APP
If it's required to send data to a different model, the *DESTINATION_APP statement can be used in default.lgf.
The sign-conversion logic also applies to data written using *DESTINATION_APP.
The same rules apply to the *WHEN/*ENDWHEN loop after *DESTINATION_APP. (By the way, in BPC NW 10 a *DESTINATION_APP statement is valid only for the next *WHEN/*ENDWHEN loop and has to be repeated before each *WHEN/*ENDWHEN that sends data to another application; in BPC NW 7.5, all *WHEN/*ENDWHEN loops after a single *DESTINATION_APP write to the target cube.)
If some dimension is missing in the destination model, *SKIP_DIM=SomeDim has to be used. But an issue can arise in the following case:
SourceModel:
DimMissingInTarget: Member1, Member2, ..., MemberN (base) - having root parent All
SomeDim: Mem1, Mem2, ... - dimension in both Source and Target
TargetModel:
SomeDim: Mem1, Mem2, ... - dimension in both Source and Target
If any of Member1, Member2, ..., MemberN are changed in SourceModel, the value of All has to be transferred to TargetModel.
The code in default.lgf of SourceModel will be:
//some calculations in the SourceModel
...
*FOR %M%=Member1,Member2,...,MemberN //list of base members of the skipped dimension
*DESTINATION_APP=TargetModel
*SKIP_DIM=DimMissingInTarget
*WHEN DimMissingInTarget
*IS %M%
*WHEN SomeDim //SomeDim - dimension existing both in Source and Target
*IS Mem1,Mem2,... //some list of members of SomeDim changed by user and to be transferred to TargetModel
*REC(EXPRESSION=[DimMissingInTarget].[All]) //Parent All value is used!
*ENDWHEN
*ENDWHEN
*NEXT
N loops for the N base members of DimMissingInTarget (useful for small N).
Another option for this particular case is to explicitly scope the skipped dimension with *XDIM_MEMBERSET:
*XDIM_MEMBERSET DimMissingInTarget=<ALL>
*DESTINATION_APP=TargetModel
*SKIP_DIM=DimMissingInTarget
*WHEN SomeDim //SomeDim - dimension existing both in Source and Target
*IS Mem1,Mem2,... //some list of members of SomeDim changed by user and to be transferred to TargetModel
*REC(EXPRESSION=%VALUE%)
*ENDWHEN
But in this case you have to put this code at the end of default.lgf, or restore the original scope for DimMissingInTarget:
*XDIM_MEMBERSET DimMissingInTarget=%DimMissingInTarget_SET% // a %xxx_SET% variable always contains the original script scope
Custom Logic BADI in default.lgf
It's also possible to call a Custom Logic BADI in default.lgf to perform calculations that are hard or even impossible to implement using script logic. The BADI has to work with the current scope and can take some fixed parameters.
Example:
//Some calculations before badi call
...
*START_BADI SOMEBADI
QUERY=ON //to get records from the current scope
WRITE=ON //to use default write to cube
DEBUG=OFF
SOMEPARAM=SOMEFIXEDVALUE
...
*END_BADI // Script scope will be reset to initial script scope here if changed before
//Some calculations after badi call
...
RUNLOGIC_PH BADI
It's also possible to use the RUNLOGIC_PH BADI (How To Implement the RUNLOGIC_PH Keyword in SAP... | SCN) to speed up some calculations using the CHANGED parameter. For example, a single change of price may have to trigger recalculation of values for multiple entities and multiple time periods.
*START_BADI RUNLOGIC_PH
QUERY=OFF
WRITE=ON
LOGIC = CALLED_LOGIC.LGF
APPSET = SameEnvironment
APP = SameModel
DIMENSION ENTITY=BAS(ALLENTITIES)
DIMENSION SomeDim=%Somedim_SET% //script initial scope
...
CHANGED=ENTITY
Write Back BADI instead of default.lgf
The same functionality can be achieved with a Write Back BADI: performing calculations triggered by user input. The details are described here: Calculations in Write Back BADI - Default.lgf R... | SCN
The significant difference between the Write Back BADI and default.lgf is that the Write Back BADI receives the data sent by the user before it is stored in the cube, and only the sent values are processed.
B.R. Vadim
P.S. 2014.06.11 - incorrect case about function with "+/-" removed.
P.P.S. 2014.07.23 - sample for scope added
P.P.P.S. 2014.09.25 - *XDIM_FILTER functionality described
P.P.P.P.S. 2016.02.26 - effect of write back badi on the scope of default.lgf
P.P.P.P.P.S. 2016.05.30 - LOOKUP - *DIM MEASURES=PERIODIC and *XDIM_MEMBERSET MEASURES=PERIODIC effect
P.P.P.P.P.P.S. 2016.07.06 - Sign effect for DM packages in BPC NW 7.5
We are executing a UJBR appset restore from 7.5 to 10.1 NW. We have our 7.5 backup archive file stored on the server (directory accessible in AL11) and execute the restore using the background method. However, once we execute the restore process, we encounter "Failed to upload file from Source". According to the SAP Note at https://websmp130.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F7361706E6F7465735F6E756D6265723D3136343039373926 , this error occurs when the restore process runs through the foreground option.
Therefore, may I know what causes this error even though we are executing in background mode?
Thank you !
Hi Gurus,
We have created a Data Manager package link with three DMP tasks assigned to it. We are able to run the DMP link successfully, but we cannot see the logs of the second and third tasks in the DMP link. When I click on the details of the status of the DMP run, it says the log file is not present on the server. I checked UJFS and the file is indeed not there. Please let me know why the log files are not being generated.
Regards,
Raghu.
Hi expert,
Scenario : Restoring BPC 7.5 Appset to BPC 10.1 HANA on BW 7.5
Another UJBR restore error I am encountering; this time, when I execute the restore process, I encounter this error:
- Restoring across product release is not supported
I did not encounter this error when using the same 7.5 backup archive file to restore onto BPC 10.1 on BW 7.4.
Please kindly help !
Thank you.
Regards,
Elvin See