[8.x] [Dataset Quality] Fix loading on dataset quality summary (#201757) (#201937)

# Backport

This will backport the following commits from `main` to `8.x`:
- [[Dataset Quality] Fix loading on dataset quality summary (#201757)](#201757)

<!--- Backport version: 9.4.3 -->

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

## 📓 Summary of the original PR (#201757)

Closes #186549

Refreshing the page did not give data ingestion enough time to complete. Navigating to the page instead, which is gated by an assertion on the loading indicators, should ensure the data is fully loaded before the checks run.

Running the test in isolation also surfaced a bug in the code: the loading state of the summary details was not synced with the data loaded by the table. As a result, the summary always showed 0 on first load and then updated immediately once the table was ready, which broke the test. Syncing the two loading indicators fixed the issue, and the test now passes in isolation as well.

Co-authored-by: Marco Antonio Ghiani <marcoantonio.ghiani@elastic.co>

Co-authored-by: Marco Antonio Ghiani <marcoantonio.ghiani01@gmail.com>
kibanamachine and tonyghiani authored Nov 27, 2024
1 parent d3e2734 commit 7b21abd
Showing 4 changed files with 43 additions and 14 deletions.
File 1 of 4 — the `useSummaryPanel` hook (file path not shown in this view):

```diff
@@ -14,8 +14,12 @@ import { filterInactiveDatasets } from '../utils';
 
 const useSummaryPanel = () => {
   const { service } = useDatasetQualityContext();
-  const { filteredItems, canUserMonitorDataset, canUserMonitorAnyDataStream, loading } =
-    useDatasetQualityTable();
+  const {
+    filteredItems,
+    canUserMonitorDataset,
+    canUserMonitorAnyDataStream,
+    loading: isTableLoading,
+  } = useDatasetQualityTable();
 
   const { timeRange } = useSelector(service, (state) => state.context.filters);
 
@@ -27,9 +31,10 @@ const useSummaryPanel = () => {
     percentages: filteredItems.map((item) => item.degradedDocs.percentage),
   };
 
-  const isDatasetsQualityLoading = useSelector(service, (state) =>
+  const isDegradedDocsLoading = useSelector(service, (state) =>
     state.matches('stats.degradedDocs.fetching')
   );
+  const isDatasetsQualityLoading = isDegradedDocsLoading || isTableLoading;
 
   /*
     User Authorization
@@ -38,7 +43,7 @@ const useSummaryPanel = () => {
     (item) => item.userPrivileges?.canMonitor ?? true
   );
 
-  const isUserAuthorizedForDataset = !loading
+  const isUserAuthorizedForDataset = !isTableLoading
     ? canUserMonitorDataset && canUserMonitorAnyDataStream && canUserMonitorAllFilteredDataStreams
     : true;
 
```
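The change above makes the summary panel report loading until both the degraded-docs stats fetch and the table data are ready, instead of watching only one of the two sources. As a rough illustration of why the flags need to be combined, here is a minimal sketch of a consumer gated on the combined flag; `SummaryCount` and its props are hypothetical names, not the actual Kibana component:

```tsx
// Minimal sketch (hypothetical component, not the actual Kibana code):
// render real numbers only once *both* data sources have loaded,
// otherwise show a loading state instead of a misleading "0".
import React from 'react';

interface SummaryProps {
  isDegradedDocsLoading: boolean; // state-machine fetch still in progress
  isTableLoading: boolean; // table hook still resolving its items
  degradedDocsCount: number; // derived from the table's filtered items
}

export function SummaryCount({
  isDegradedDocsLoading,
  isTableLoading,
  degradedDocsCount,
}: SummaryProps) {
  // Mirrors the fix: a single combined flag instead of only one source.
  const isDatasetsQualityLoading = isDegradedDocsLoading || isTableLoading;

  return isDatasetsQualityLoading ? <span>Loading…</span> : <span>{degradedDocsCount}</span>;
}
```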
30 changes: 28 additions & 2 deletions x-pack/test/functional/apps/dataset_quality/config.ts
```diff
@@ -9,7 +9,33 @@ import { FtrConfigProviderContext, GenericFtrProviderContext } from '@kbn/test';
 import { createLogger, LogLevel, LogsSynthtraceEsClient } from '@kbn/apm-synthtrace';
 import { FtrProviderContext } from '../../ftr_provider_context';
 
-export default async function createTestConfig({ readConfigFile }: FtrConfigProviderContext) {
+import { FtrProviderContext as InheritedFtrProviderContext } from '../../ftr_provider_context';
+
+export type InheritedServices = InheritedFtrProviderContext extends GenericFtrProviderContext<
+  infer TServices,
+  {}
+>
+  ? TServices
+  : {};
+
+export type InheritedPageObjects = InheritedFtrProviderContext extends GenericFtrProviderContext<
+  infer TServices,
+  infer TPageObjects
+>
+  ? TPageObjects
+  : {};
+
+interface DatasetQualityConfig {
+  services: InheritedServices & {
+    logSynthtraceEsClient: (
+      context: InheritedFtrProviderContext
+    ) => Promise<LogsSynthtraceEsClient>;
+  };
+}
+
+export default async function createTestConfig({
+  readConfigFile,
+}: FtrConfigProviderContext): Promise<DatasetQualityConfig> {
   const functionalConfig = await readConfigFile(require.resolve('../../config.base.js'));
   const services = functionalConfig.get('services');
   const pageObjects = functionalConfig.get('pageObjects');
@@ -34,7 +60,7 @@ export default async function createTestConfig({ readConfigFile }: FtrConfigProviderContext) {
 export type CreateTestConfig = Awaited<ReturnType<typeof createTestConfig>>;
 
 export type DatasetQualityServices = CreateTestConfig['services'];
-export type DatasetQualityPageObject = CreateTestConfig['pageObjects'];
+export type DatasetQualityPageObject = InheritedPageObjects;
 
 export type DatasetQualityFtrProviderContext = GenericFtrProviderContext<
   DatasetQualityServices,
```
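The new `InheritedServices` and `InheritedPageObjects` types use TypeScript conditional types with `infer` to pull the type parameters back out of `GenericFtrProviderContext`, so the config can extend the inherited services without restating them. A stripped-down sketch of the same extraction pattern, using standalone illustrative names (`Provider`, `ServicesOf`, `PageObjectsOf`) rather than the Kibana types:

```ts
// Standalone sketch of the `infer` extraction pattern used above
// (Provider/ServicesOf/PageObjectsOf are illustrative names, not Kibana types).

// A generic "context" type parameterised by services and page objects.
interface Provider<TServices, TPageObjects> {
  getService<K extends keyof TServices>(name: K): TServices[K];
  getPageObjects<K extends keyof TPageObjects>(names: K[]): Pick<TPageObjects, K>;
}

// Conditional types that recover the type arguments from a concrete Provider.
type ServicesOf<T> = T extends Provider<infer S, any> ? S : {};
type PageObjectsOf<T> = T extends Provider<any, infer P> ? P : {};

// Example: a concrete context...
type MyContext = Provider<
  { es: { search(q: string): Promise<unknown> } },
  { home: { visit(): Promise<void> } }
>;

// ...from which the pieces can be extracted and extended, the way the config
// extends the inherited services with an extra synthtrace client factory.
type MyServices = ServicesOf<MyContext> & { extraService: () => Promise<string> };
type MyPageObjects = PageObjectsOf<MyContext>;
```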
File 3 of 4 — a dataset quality functional test (file path not shown in this view):

```diff
@@ -423,7 +423,7 @@ export default function ({ getService, getPageObjects }: DatasetQualityFtrProviderContext) {
         );
       });
 
-      countColumn.sort('ascending');
+      await countColumn.sort('ascending');
 
       await retry.tryForTime(5000, async () => {
         const currentUrl = await browser.getCurrentUrl();
```
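The only change here is awaiting `countColumn.sort('ascending')`. In WebDriver-based functional tests that call returns a promise; without `await`, the following `retry.tryForTime` block can start before the sort has actually been applied, which is a classic source of flakiness. A generic sketch of the failure mode, with hypothetical `clickSort`/`readUrl` helpers standing in for the real page-object methods:

```ts
// Hypothetical helpers standing in for the real page-object methods.
declare function clickSort(direction: 'ascending' | 'descending'): Promise<void>;
declare function readUrl(): Promise<string>;

async function flaky() {
  clickSort('ascending'); // fire-and-forget: the promise is not awaited
  const url = await readUrl(); // may observe the URL *before* the sort updated it
  return url;
}

async function fixed() {
  await clickSort('ascending'); // wait for the browser interaction to finish
  const url = await readUrl(); // now reflects the sorted state
  return url;
}
```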
File 4 of 4 — the 'Dataset quality summary' functional test (file path not shown in this view):

```diff
@@ -19,7 +19,7 @@ export default function ({ getService, getPageObjects }: DatasetQualityFtrProviderContext) {
   const synthtrace = getService('logSynthtraceEsClient');
   const to = '2024-01-01T12:00:00.000Z';
 
-  const ingestDataForSummary = async () => {
+  const ingestDataForSummary = () => {
     // Ingest documents for 3 type of datasets
     return synthtrace.index([
       // Ingest good data to all 3 datasets
@@ -48,16 +48,14 @@ export default function ({ getService, getPageObjects }: DatasetQualityFtrProviderContext) {
   };
 
   describe('Dataset quality summary', () => {
-    before(async () => {
-      await synthtrace.index(getInitialTestLogs({ to, count: 4 }));
-      await PageObjects.datasetQuality.navigateTo();
-    });
-
     afterEach(async () => {
       await synthtrace.clean();
     });
 
     it('shows poor, degraded and good count as 0 and all dataset as healthy', async () => {
+      await synthtrace.index(getInitialTestLogs({ to, count: 4 }));
+      await PageObjects.datasetQuality.navigateTo();
+
       const summary = await PageObjects.datasetQuality.parseSummaryPanel();
       expect(summary).to.eql({
         datasetHealthPoor: '0',
@@ -70,7 +68,7 @@ export default function ({ getService, getPageObjects }: DatasetQualityFtrProviderContext) {
 
     it('shows updated count for poor, degraded and good datasets, estimated size and updates active datasets', async () => {
       await ingestDataForSummary();
-      await PageObjects.datasetQuality.refreshTable();
+      await PageObjects.datasetQuality.navigateTo();
 
       const summary = await PageObjects.datasetQuality.parseSummaryPanel();
       const { estimatedData, ...restOfSummary } = summary;
```
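Because `afterEach` calls `synthtrace.clean()`, data ingested once in a shared `before` hook is wiped after the first test; moving ingestion and navigation into each `it` block keeps every test self-sufficient, and `navigateTo()`, which is gated by assertions on the loading indicators, replaces the bare table refresh. A condensed sketch of the resulting layout, with placeholder helpers rather than the real suite:

```ts
// Condensed sketch of the test layout after the change (placeholder helpers, not the real suite).
declare const synthtrace: { index(docs: unknown[]): Promise<void>; clean(): Promise<void> };
declare function getInitialTestLogs(opts: { to: string; count: number }): unknown[];
declare function navigateToDatasetQuality(): Promise<void>; // waits for loading indicators
declare function parseSummaryPanel(): Promise<Record<string, string>>;
declare function describe(name: string, fn: () => void): void;
declare function it(name: string, fn: () => Promise<void>): void;
declare function afterEach(fn: () => Promise<void>): void;

const to = '2024-01-01T12:00:00.000Z';

describe('Dataset quality summary', () => {
  // Each test cleans up after itself...
  afterEach(async () => {
    await synthtrace.clean();
  });

  // ...so each test also ingests and navigates on its own, instead of relying
  // on a shared `before` hook whose data the first cleanup would wipe out.
  it('shows all counts as 0 for healthy data', async () => {
    await synthtrace.index(getInitialTestLogs({ to, count: 4 }));
    await navigateToDatasetQuality();
    const summary = await parseSummaryPanel();
    // assertions on `summary` go here
  });
});
```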
