This document outlines a thorough, step-by-step process for analyzing market risk data with a focus on data lineage and transparency. The guide explains how data is sourced, processed, and validated, ensuring that it reaches its optimal state for analytical purposes. The process involves automated workflows, data normalization, validation, and transformation, ultimately leading to the selection of a golden copy for use. We will also explore the ways in which users can interact with this data for deeper insights and analysis.
Understand the importance of lineage and transparency in market risk data. This involves knowing the data's origin and the transformations it undergoes to reach its final state.

Normalize the data and apply validation rules, derivations, or transformations. Capture all these processes to make the information accessible to end users.
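A minimal sketch of this stage is shown below, assuming pandas is available; the column names (tenor, rate, source) and the plausibility band are illustrative, not the system's actual schema.

```python
import pandas as pd

# Hypothetical raw vendor quotes; column names are illustrative only.
raw = pd.DataFrame({
    "tenor":  ["1Y", "2Y", "5Y", "10Y"],
    "rate":   [1.25, None, 1.80, 2.10],   # quoted in percent
    "source": ["VendorA", "VendorA", "VendorB", "VendorB"],
})

def normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Bring raw quotes to a common shape: drop blanks, standardize units."""
    out = df.dropna(subset=["rate"]).copy()
    out["rate"] = out["rate"] / 100.0      # percent -> decimal
    return out

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Flag values outside a plausibility band; the flag itself is lineage."""
    out = df.copy()
    out["valid"] = out["rate"].between(-0.01, 0.20)   # illustrative band
    return out

clean = validate(normalize(raw))
print(clean)   # each step above can be captured and surfaced to end users
```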

Conduct a demonstration focusing on accessing reference data such as curve data and individual tenor points. This includes detailed information about issuers, classifications, or ratings.

Review information about contributors, including liquidity metrics and validation lineages. Lastly, examine time series and analytical datasets for further insights.

Create a trustworthy time series dataset to enhance transparency and lineage. Initiate the process by setting up your screen for analysis.

Examine an interest rate curve representation for a specific date. Automated workflows have compiled this golden curve from various vendor sources, ensuring consistency.
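To illustrate how such a workflow might compose a golden curve, the sketch below selects, per tenor, the valid quote from the highest-priority vendor; the priority order, field names, and values are all assumptions.

```python
import pandas as pd

# Hypothetical validated quotes from several vendors for one date.
quotes = pd.DataFrame({
    "tenor":  ["1Y", "1Y", "5Y", "5Y", "10Y"],
    "rate":   [0.0125, 0.0127, 0.0180, 0.0178, 0.0210],
    "vendor": ["VendorA", "VendorB", "VendorA", "VendorB", "VendorB"],
    "valid":  [True, True, False, True, True],
})

PRIORITY = ["VendorA", "VendorB"]   # assumed ranking of sources

def golden_curve(df: pd.DataFrame) -> pd.DataFrame:
    """Per tenor, keep the valid quote from the highest-priority vendor."""
    df = df[df["valid"]].copy()
    df["rank"] = df["vendor"].map(PRIORITY.index)
    return (df.sort_values("rank")
              .groupby("tenor", as_index=False)
              .first()[["tenor", "rate", "vendor"]])

print(golden_curve(quotes))   # note 5Y falls back to VendorB: VendorA failed
```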

Each line on the chart represents a vendor's curve, with one selected as the golden curve. Utilize the visual tools available to investigate the data further with ease.

Quickly examine the visual lineage of the data and explore individual tenors in detail.

Select a tenor point to view the extensive data captured, including price types, quote types, and associated commentary or changes.

Access extensive data, including actual values and the rules applied, from the origin point to the vendor file.

View the status and results of rules applied to the data, including deviations and thresholds. Explore liquidity measures such as bid-offer spreads and contributor numbers.
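As an illustration of how a deviation rule and a bid-offer liquidity measure might be computed, consider the sketch below; the threshold and field names are invented for the example.

```python
import pandas as pd

# Hypothetical golden-curve points with prior values and bid/offer quotes.
points = pd.DataFrame({
    "tenor": ["1Y", "5Y", "10Y"],
    "rate":  [0.0125, 0.0180, 0.0260],
    "prev":  [0.0124, 0.0179, 0.0210],   # previous day's golden value
    "bid":   [0.0124, 0.0178, 0.0255],
    "offer": [0.0126, 0.0182, 0.0265],
})

THRESHOLD = 0.0020   # illustrative day-on-day deviation limit (20 bp)

points["deviation"] = (points["rate"] - points["prev"]).abs()
points["rule_status"] = points["deviation"].apply(
    lambda d: "pass" if d <= THRESHOLD else "fail")
points["bid_offer_spread"] = points["offer"] - points["bid"]  # liquidity proxy

print(points[["tenor", "deviation", "rule_status", "bid_offer_spread"]])
```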

Gain visibility into the trustworthiness of the market data. Review the contributors and vendors involved in creating the golden copy.

Rank vendors according to the rules applied in order to select the golden copy.
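One way to picture this ranking step, assuming each vendor carries per-rule pass/fail results, is the sketch below; the rule names and scoring weights are invented for illustration.

```python
# Hypothetical per-vendor rule results for a single tenor point.
rule_results = {
    "VendorA": {"staleness": True,  "deviation": True, "completeness": False},
    "VendorB": {"staleness": True,  "deviation": True, "completeness": True},
    "VendorC": {"staleness": False, "deviation": True, "completeness": True},
}

WEIGHTS = {"staleness": 3, "deviation": 2, "completeness": 1}  # assumed

def score(results: dict) -> int:
    """Sum the weights of every rule the vendor passed."""
    return sum(WEIGHTS[rule] for rule, passed in results.items() if passed)

ranking = sorted(rule_results, key=lambda v: score(rule_results[v]),
                 reverse=True)
print("ranking:", ranking)            # best candidate first
print("golden candidate:", ranking[0])
```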

Access various options by right-clicking on an individual point, including exception-resolution methods and contributor analysis on a visual scale.

Identify the golden candidate and its placement on the visual scale within the screen interface.

Ensure that lineage and audit information captures both manual and system-generated changes and derivations, and that these are stored in the database.

Export the audit report and parameterize it by specific dates. A pre-downloaded file can be used for reference.

Review the summary screen, exported as an Excel spreadsheet for a specific date, detailing system and user activity by asset type.

Drill down into the details of manual changes and the metadata for maker and checker users, including associated commentary.

Access underlying reference data through right-click, navigating to the curve's definition for detailed insights.

Explore the construction details of a curve, such as the CHF curve, with its underlying index and specific points.

Navigate to specific tenor points to view terms, conditions, calendars, and other related information.

Utilize this data for downstream systems, aiding in processes like bootstrapping and surface smoothing.
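To give a flavor of one such downstream process, here is a minimal zero-curve bootstrap from par swap rates, assuming annual fixed payments; the rates are invented for illustration.

```python
# Minimal zero-curve bootstrap from annual par swap rates (illustrative data).
par_rates = {1: 0.0125, 2: 0.0140, 3: 0.0155, 4: 0.0168, 5: 0.0180}

discounts = {}
for n in sorted(par_rates):
    s = par_rates[n]
    annuity = sum(discounts[i] for i in range(1, n))   # sum of earlier DFs
    discounts[n] = (1 - s * annuity) / (1 + s)         # par swap identity

for n, df in discounts.items():
    zero = df ** (-1 / n) - 1                          # annually compounded
    print(f"{n}Y  DF={df:.6f}  zero={zero:.4%}")
```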

Review the daily market data process and how changes are recorded, then explore the time series dataset in greater depth.

Access the time series screen to view CHF IRS curve details and a decade of historical data in a color-coded, Excel-like grid view.

Investigate additional price details, including status and data source, for a thorough analysis.

Capture price changes and the details of failed validations in the system's data columns.

Visualize and correlate data points, performing ad hoc checks and leveraging analytical functions.

Utilize system-generated checks or parameterize your own, conducting standard deviation or outlier checks as needed.
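A sketch of such an outlier check is shown below, flagging points more than a few standard deviations from a rolling mean; the window length, multiplier, and price series are assumptions.

```python
import pandas as pd

# Hypothetical daily prices for one tenor; the series is illustrative.
prices = pd.Series([100.1, 100.3, 100.2, 100.4, 103.9, 100.5, 100.6])

window = 5   # assumed look-back for the rolling statistics
k = 3.0      # flag anything more than k standard deviations away

rolling_mean = prices.rolling(window, min_periods=2).mean().shift(1)
rolling_std = prices.rolling(window, min_periods=2).std().shift(1)
z_scores = (prices - rolling_mean) / rolling_std
outliers = z_scores.abs() > k

print(pd.DataFrame({"price": prices, "z": z_scores, "outlier": outliers}))
```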

Choose from various analytical functions, like percent return, to analyze liquidity horizon impacts.
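As an illustration, a percent-return series over a configurable liquidity horizon might look like the sketch below; the horizon length and rate values are assumptions.

```python
import pandas as pd

# Illustrative daily rates for one tenor point.
rates = pd.Series([0.0120, 0.0122, 0.0125, 0.0121, 0.0130, 0.0128],
                  name="5Y_rate")

horizon = 2   # assumed liquidity horizon in business days

# Percent return measured across the liquidity horizon.
pct_return = rates.pct_change(periods=horizon) * 100
print(pct_return.rename(f"{horizon}d_pct_return"))
```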

View generated series and plot them on a chart using the chosen analytical function for insights on data changes.

Analyze the system-generated data for time series shock impacts across the curve, identifying outliers and verifying data consistency.

Implement advanced calibrations using the Python integration, enabling analyses such as PCA alongside other visual features.
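A sketch of what a PCA over daily curve moves could look like in such a Python integration, assuming only numpy is available; the data here is synthetic, built so the first component captures a common level shift.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily changes for a 5-tenor curve: a level move plus noise.
level = rng.normal(0, 1, size=(250, 1))    # common "shift" factor
noise = rng.normal(0, 0.2, size=(250, 5))
moves = level + noise                       # 250 days x 5 tenors

# PCA via SVD of the centered data matrix.
centered = moves - moves.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained:", np.round(explained, 3))
print("first component (loadings):", np.round(vt[0], 3))
```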

Plot data points to calculate correlations, highlighting any outliers for better understanding of the dataset.
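The sketch below shows one way such a correlation plot with highlighted outliers could be built, assuming matplotlib is available; the two series, the assumed linear relation, and the injected outlier are synthetic.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Synthetic daily moves for two related tenors, plus one injected outlier.
x = rng.normal(0, 1, 250)
y = 0.9 * x + rng.normal(0, 0.3, 250)
x[100], y[100] = 4.0, -3.0                  # deliberate outlier

corr = np.corrcoef(x, y)[0, 1]
resid = y - 0.9 * x                          # assumed linear relation
flags = np.abs(resid - resid.mean()) > 3 * resid.std()

plt.scatter(x, y, s=10, label=f"corr = {corr:.2f}")
plt.scatter(x[flags], y[flags], color="red", label="outliers")
plt.legend()
plt.show()
```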

Customize Python scripts to extend analysis to other curves, ensuring comprehensive data handling.

Explore additional analyses like PCA or bootstrapping with available tools for more in-depth evaluation.

Conclude the agenda and open the floor for any questions or further clarifications needed on the process.
