
Inputs for predictive metrics in MicroStrategy

Inputs for predictive metrics

A predictive metric can be created using attributes and metrics as its inputs. How you define the attributes and metrics you use as inputs for your predictive metrics affects the resulting predictive metrics, as described in:
Attributes as inputs for predictive metrics
Level metrics as inputs for predictive metrics
Conditional metrics as inputs for predictive metrics

Attributes as inputs for predictive metrics

Attributes can be used as inputs for predictive metrics. Data mining often analyzes non-numeric, demographic, and psychographic information about customers, looking for attributes that are strong predictors.
For example, your MicroStrategy project contains a Customer attribute with related attributes for age, gender, and income. You can include an attribute, such as the Customer attribute, directly in a training metric, as described in Creating a predictive model using MicroStrategy.
Including an attribute directly in a training metric creates a predictive metric that uses the attribute as one of its inputs. When using attributes directly in training metrics to create predictive metrics, be aware of the following:
The training metric uses the attribute’s ID form to include the attribute information in a predictive metric. If an attribute includes additional attribute forms, other than the ID form, that are to be used as inputs for predictive metrics, you can create metrics based on those attribute forms. Once created, these metrics can be used as inputs for predictive metrics. This scenario for creating attribute-based predictive metrics is described in Creating metrics to use additional attribute forms as inputs for predictive metrics below.
Attribute forms must use a text or numeric data type. Date data cannot be represented correctly when a predictive metric is created, so if an attribute form uses date values, you must convert those dates into a numeric format before using the attribute form to create predictive metrics.
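For example, a minimal sketch of a metric that converts a hypothetical Birth Date form of the Customer attribute into a numeric value (the birth year), assuming such a form exists in your project:

Max(Year(Customer@[Birth Date]))

Because the Year function returns a number, the resulting metric can be used as an input for a predictive metric.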

Creating metrics to use additional attribute forms as inputs for predictive metrics

If attributes include additional attribute forms other than their ID form that are to be used as inputs for predictive metrics, you can create metrics based on these attribute forms. The resulting metric can then be used as an input for a predictive metric, thus allowing the attribute information to be included in a predictive metric.
The steps below show you how to create a metric based on an attribute form. The resulting metric, which contains the attribute information, can then be used to create a predictive metric.
Prerequisite
This procedure assumes you are familiar with the process of creating a metric. For steps on how to create metrics, see Advanced Metrics.

To create metrics to use additional attribute forms as inputs for predictive metrics

1. Using the Metric Editor, create a new metric expression. All metric expressions must have an aggregation function. To support including attribute information in the metric expression, in the Definition area, type Max() to use the Max aggregation function.
2. Within the parentheses of the Max() aggregation function, specify the desired attribute form using the AttributeName@FormName format, where:
AttributeName: The name of the attribute. If there are spaces in the attribute name, you can enclose it in square brackets ([]).
FormName: The name of the attribute form. Be aware that this is different from the attribute form category. If there are spaces in the attribute form name, you can enclose it in square brackets ([]).
For example, the Discount form of the Promotion attribute can be included in the metric, as shown in the sketch below.
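A minimal sketch of such an expression, assuming a Promotion attribute with a Discount form as in the example:

Max(Promotion@Discount)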
3. Add the attribute as a metric level so that this metric always returns results at the level of the attribute.
4. If the predictive metric will be used to forecast values for elements that do not exist in your project, you must define the join type of this input metric as an outer join. For example, if the predictive metric is meant to forecast values for one year in the future, and that future year is not represented in the project, the input metric must use an outer join so that values are returned.
To enable outer joins to include all data:
a. Select Metric Join Type from the Tools menu. The Metric Join Type dialog box opens.
b. Clear the Use default inherited value check box.
c. Select Outer.
d. Click OK to close the dialog box.
5. If you plan to export predictive metric results to a third-party tool, you should define a column alias for this input metric. This ensures that the metric’s name is visible when you view the exported results in the third-party tool.
To create a metric column alias to ensure the column name matches the metric’s name:
a. Select Advanced Settings from the Tools menu, and then select Metric Column Options. The Metric Column Alias Options dialog box opens.
b. In the Column Name field, type the alias.
c. Click OK to close the dialog box.
6. Save the metric, using the alias from the previous step as the metric name. You can now include the metric in a training metric to create a predictive metric, as described in Creating a predictive model using MicroStrategy.

Level metrics as inputs for predictive metrics

The attribute used on the rows of the dataset report sets the level of the data by restricting the data to a particular level, or dimension, of the data model.
For example, if the Customer attribute is placed on the rows and the Revenue metric on the columns of a report, the data in the Revenue column is at the customer level. If the Revenue metric is used in the predictive model without any level, the data it produces changes based on the attributes of the report that uses the predictive metric. If Year is placed on the rows instead of Customer, the predictive metric calculates yearly revenue rather than customer revenue, and passing yearly revenue to a predictive model based on customer revenue yields the wrong results.
This problem can be easily resolved by creating a separate metric, which is then used as an input for the predictive metric. This separate metric matches the definition of Revenue, but also defines its level as Customer. This approach is better than adding a level directly to the Revenue metric itself, because the Revenue metric may be used in other situations where the level should not be set to Customer. Such a metric would look like the sketch below.
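A minimal sketch of how such a metric definition might appear in the Metric Editor after Customer is added as a level target, assuming Revenue is a simple sum of a revenue fact (~ denotes the report level):

Sum(Revenue) {~+, Customer+}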
Prerequisite
This procedure assumes you are familiar with the process of creating a metric. For steps on how to create metrics, see Advanced Metrics.

To create level metrics to use as inputs for predictive metrics

1. In the Metric Editor, open the metric that requires a level.
2. Clear any Break-by parameters that may exist on the metric’s function:
a. Highlight the function in the Definition pane to select it.
b. Right-click the function and then select Function_Name parameters. The Parameters dialog box opens.
c. On the Break By tab, click Reset.
d. Click OK to close the dialog box.
3. Add the necessary attributes as metric levels:
a. Click Level (Dimensionality) on the Metric component pane.
b. In the Object Browser, double-click each attribute to add it as a level.
4. If the predictive metric will be used to forecast values for elements that do not exist in your project, you must define the join type of this input metric as an outer join. For example, if the predictive metric is meant to forecast values for one year in the future, and that future year is not represented in the project, the input metric must use an outer join so that values are returned.
To enable outer joins to include all data:
a. Select Metric Join Type from the Tools menu. The Metric Join Type dialog box opens.
b. Clear the Use default inherited value check box.
c. Select Outer.
d. Click OK to close the dialog box.
5. If you plan to export predictive metric results to a third-party tool, you should define a column alias for this input metric. This ensures that the metric’s name is visible when you view the exported results in the third-party tool.
To create a metric column alias to ensure the column name matches the metric’s name:
a. Select Advanced Settings from the Tools menu, and then select Metric Column Options. The Metric Column Alias Options dialog box opens.
b. In the Column Name field, type the alias.
c. Click OK to close the dialog box.
6. Save the metric with the alias name from the previous step. You can now include the metric in a training metric to create a predictive metric, as described in Creating a predictive model using MicroStrategy.

Conditional metrics as inputs for predictive metrics

To group a metric’s results by the elements of an attribute, create a conditional metric for each element. For example, suppose you want to use customer revenue grouped by payment method in your data mining analysis. If you place the Customer attribute on the rows of the report, and the Revenue metric and the Payment Method attribute on the columns, the result is a report with one Revenue column for each payment method.
However, this report presents problems if it is used as a dataset report, because multiple headings are generated for the columns: Revenue plus each Payment Method element. Additionally, each column contains revenue for a particular payment method, and unless a metric exists that matches this definition, it is difficult to deploy any model that uses one of these columns.
To solve this problem, create a separate metric for each payment method, to be used as an input for a predictive metric, that filters Revenue for that payment method. Each of these metrics has the same definition as the original Revenue metric, but its conditionality is set to filter Revenue by a particular payment method.
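A minimal sketch of how one such conditional metric definition might appear, assuming a filter named Visa Payments that qualifies on Payment Method = Visa (the filter name is an assumption used for illustration):

Sum(Revenue) {~+} <[Visa Payments]>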
Prerequisite
This procedure assumes you are familiar with the process of creating metrics and filters. For steps on how to create metrics, see Advanced Metrics. For steps on how to create filters, see Advanced Filters: Filtering Data on Reports.

To create a conditional predictive metric

1. Create a separate filter for each of the necessary attribute elements. For the example above, they are Payment Method = Visa, Payment Method = Amex, Payment Method = Check, and so on.
2. For each filter, create a separate metric to be used as an input for a predictive metric, as explained in the section above.
3. Add each filter you created as the condition of its corresponding input metric, then save the metric. You can now include these metrics in a training metric to create a predictive metric, as described in Creating a predictive model using MicroStrategy.
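For instance, the procedure above might yield one conditional metric per payment method, along the lines of this sketch (the metric and filter names are assumptions used for illustration):

Revenue (Visa): Sum(Revenue) {~+} <[Visa Payments]>
Revenue (Amex): Sum(Revenue) {~+} <[Amex Payments]>
Revenue (Check): Sum(Revenue) {~+} <[Check Payments]>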
A report that places the Customer attribute on the rows and these conditional metrics on the columns returns the same results as the first report, but in a dataset report format, with a single heading per column.
