feat(graphql): adds container aspect for dataflow and datajob entities #12236

Merged
datahub-graphql-core/src/main/java/com/linkedin/datahub/graphql/GmsGraphQLEngine.java
@@ -2377,6 +2377,17 @@
? dataJob.getDataPlatformInstance().getUrn()
: null;
}))
.dataFetcher(
"container",
new LoadableTypeResolver<>(
containerType,
(env) -> {
final DataJob dataJob = env.getSource();
return dataJob.getContainer() != null
? dataJob.getContainer().getUrn()
: null;
}))
.dataFetcher("parentContainers", new ParentContainersResolver(entityClient))
.dataFetcher("runs", new DataJobRunsResolver(entityClient))
.dataFetcher("privileges", new EntityPrivilegesResolver(entityClient))
.dataFetcher("exists", new EntityExistsResolver(entityService))
@@ -2454,6 +2465,17 @@
? dataFlow.getDataPlatformInstance().getUrn()
: null;
}))
.dataFetcher(
"container",
new LoadableTypeResolver<>(
containerType,
(env) -> {
final DataFlow dataFlow = env.getSource();
return dataFlow.getContainer() != null
? dataFlow.getContainer().getUrn()
: null;
}))
.dataFetcher("parentContainers", new ParentContainersResolver(entityClient))
.dataFetcher(
"health",
new EntityHealthResolver(
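The new `container` data fetchers above follow the same null-safe pattern as the existing `dataPlatformInstance` fetcher: the lambda handed to `LoadableTypeResolver` extracts the container URN from the source entity, resolving to `null` when no container aspect is present. A minimal self-contained sketch of that key-resolver pattern (`DataJobStub` and `ContainerRef` are simplified stand-ins for DataHub's generated classes, not the real types):

```java
import java.util.function.Function;

// Self-contained sketch of the null-safe key-resolver pattern used above.
// DataJobStub / ContainerRef are simplified stand-ins for DataHub's generated
// DataJob and Container classes, not the real types.
public class ContainerKeySketch {
    record ContainerRef(String urn) {
        String getUrn() { return urn; }
    }

    record DataJobStub(ContainerRef container) {
        ContainerRef getContainer() { return container; }
    }

    // Mirrors the lambda passed to LoadableTypeResolver: resolve to the container
    // URN when the aspect is present, or null so the GraphQL field resolves to null.
    static final Function<DataJobStub, String> CONTAINER_KEY =
        job -> job.getContainer() != null ? job.getContainer().getUrn() : null;

    public static void main(String[] args) {
        System.out.println(CONTAINER_KEY.apply(
            new DataJobStub(new ContainerRef("urn:li:container:abc")))); // urn:li:container:abc
        System.out.println(CONTAINER_KEY.apply(new DataJobStub(null))); // null
    }
}
```

Returning `null` as the key (rather than throwing) lets entities that have no container keep resolving the rest of their fields normally.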
@@ -74,6 +74,7 @@ public class DataFlowType
DOMAINS_ASPECT_NAME,
DEPRECATION_ASPECT_NAME,
DATA_PLATFORM_INSTANCE_ASPECT_NAME,
CONTAINER_ASPECT_NAME,
DATA_PRODUCTS_ASPECT_NAME,
BROWSE_PATHS_V2_ASPECT_NAME,
STRUCTURED_PROPERTIES_ASPECT_NAME,
@@ -16,6 +16,7 @@
import com.linkedin.data.DataMap;
import com.linkedin.datahub.graphql.QueryContext;
import com.linkedin.datahub.graphql.authorization.AuthorizationUtils;
import com.linkedin.datahub.graphql.generated.Container;
import com.linkedin.datahub.graphql.generated.DataFlow;
import com.linkedin.datahub.graphql.generated.DataFlowEditableProperties;
import com.linkedin.datahub.graphql.generated.DataFlowInfo;
@@ -106,6 +107,7 @@ public DataFlow apply(
(dataset, dataMap) ->
dataset.setDataPlatformInstance(
DataPlatformInstanceAspectMapper.map(context, new DataPlatformInstance(dataMap))));
mappingHelper.mapToResult(context, CONTAINER_ASPECT_NAME, DataFlowMapper::mapContainers);
mappingHelper.mapToResult(
BROWSE_PATHS_V2_ASPECT_NAME,
(dataFlow, dataMap) ->
@@ -206,6 +208,17 @@ private static void mapGlobalTags(
dataFlow.setTags(globalTags);
}

private static void mapContainers(
@Nullable final QueryContext context, @Nonnull DataFlow dataFlow, @Nonnull DataMap dataMap) {
final com.linkedin.container.Container gmsContainer =
new com.linkedin.container.Container(dataMap);
dataFlow.setContainer(
Container.builder()
.setType(EntityType.CONTAINER)
.setUrn(gmsContainer.getContainer().toString())
.build());
}

private static void mapDomains(
@Nullable final QueryContext context, @Nonnull DataFlow dataFlow, @Nonnull DataMap dataMap) {
final Domains domains = new Domains(dataMap);
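The `mapContainers` helper simply lifts the container URN out of the GMS `container` aspect's data and wraps it in a minimal GraphQL `Container` reference carrying only the type and URN. Sketched with a plain Java map standing in for `DataMap` and a simplified record standing in for the generated builder class (both are assumptions, not the actual DataHub classes):

```java
import java.util.Map;

// Sketch of what mapContainers does: lift the "container" urn field out of the
// raw aspect data into a minimal GraphQL container reference.
// GqlContainer is a simplified stand-in for the generated Container class.
public class ContainerMappingSketch {
    record GqlContainer(String type, String urn) {}

    // The GMS Container aspect holds a single "container" field whose value is
    // the parent container's URN; missing data maps to null.
    static GqlContainer mapContainer(Map<String, Object> aspectData) {
        Object urn = aspectData.get("container");
        return urn == null ? null : new GqlContainer("CONTAINER", urn.toString());
    }

    public static void main(String[] args) {
        Map<String, Object> data = Map.of("container", "urn:li:container:abc");
        System.out.println(mapContainer(data).urn()); // urn:li:container:abc
        System.out.println(mapContainer(Map.of())); // null
    }
}
```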
@@ -75,6 +75,7 @@ public class DataJobType
DOMAINS_ASPECT_NAME,
DEPRECATION_ASPECT_NAME,
DATA_PLATFORM_INSTANCE_ASPECT_NAME,
CONTAINER_ASPECT_NAME,
DATA_PRODUCTS_ASPECT_NAME,
BROWSE_PATHS_V2_ASPECT_NAME,
SUB_TYPES_ASPECT_NAME,
@@ -9,6 +9,7 @@
import com.linkedin.data.DataMap;
import com.linkedin.datahub.graphql.QueryContext;
import com.linkedin.datahub.graphql.authorization.AuthorizationUtils;
import com.linkedin.datahub.graphql.generated.Container;
import com.linkedin.datahub.graphql.generated.DataFlow;
import com.linkedin.datahub.graphql.generated.DataJob;
import com.linkedin.datahub.graphql.generated.DataJobEditableProperties;
@@ -112,6 +113,14 @@ public DataJob apply(
} else if (DATA_PLATFORM_INSTANCE_ASPECT_NAME.equals(name)) {
result.setDataPlatformInstance(
DataPlatformInstanceAspectMapper.map(context, new DataPlatformInstance(data)));
} else if (CONTAINER_ASPECT_NAME.equals(name)) {
final com.linkedin.container.Container gmsContainer =
new com.linkedin.container.Container(data);
result.setContainer(
Container.builder()
.setType(EntityType.CONTAINER)
.setUrn(gmsContainer.getContainer().toString())
.build());
} else if (BROWSE_PATHS_V2_ASPECT_NAME.equals(name)) {
result.setBrowsePathV2(BrowsePathsV2Mapper.map(context, new BrowsePathsV2(data)));
} else if (SUB_TYPES_ASPECT_NAME.equals(name)) {
20 changes: 20 additions & 0 deletions datahub-graphql-core/src/main/resources/entity.graphql
@@ -6275,6 +6275,16 @@ type DataFlow implements EntityWithRelationships & Entity & BrowsableEntity {
"""
dataPlatformInstance: DataPlatformInstance

"""
The parent container in which the entity resides
"""
container: Container

"""
Recursively get the lineage of containers for this entity
"""
parentContainers: ParentContainersResult

"""
Granular API for querying edges extending from this entity
"""
@@ -6457,6 +6467,16 @@ type DataJob implements EntityWithRelationships & Entity & BrowsableEntity {
"""
dataPlatformInstance: DataPlatformInstance

"""
The parent container in which the entity resides
"""
container: Container

"""
Recursively get the lineage of containers for this entity
"""
parentContainers: ParentContainersResult

"""
Additional read write properties associated with the Data Job
"""
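With the schema additions above deployed, clients can request the new fields alongside existing ones. A hypothetical query (the URN is illustrative, and the available sub-fields depend on the deployed `Container` and `ParentContainersResult` types):

```graphql
query dataJobContainer {
  dataJob(urn: "urn:li:dataJob:(urn:li:dataFlow:(airflow,example_dag,PROD),example_task)") {
    urn
    container {
      urn
      type
    }
    parentContainers {
      count
      containers {
        urn
      }
    }
  }
}
```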
@@ -0,0 +1,42 @@
package com.linkedin.datahub.graphql.types.dataflow.mappers;

import com.linkedin.common.urn.Urn;
import com.linkedin.datahub.graphql.generated.DataFlow;
import com.linkedin.entity.Aspect;
import com.linkedin.entity.EntityResponse;
import com.linkedin.entity.EnvelopedAspect;
import com.linkedin.entity.EnvelopedAspectMap;
import com.linkedin.metadata.Constants;
import java.net.URISyntaxException;
import java.util.HashMap;
import java.util.Map;
import org.testng.Assert;
import org.testng.annotations.Test;

public class DataFlowMapperTest {
private static final Urn TEST_DATA_FLOW_URN =
Urn.createFromTuple(Constants.DATA_FLOW_ENTITY_NAME, "dataflow1");
private static final Urn TEST_CONTAINER_URN =
Urn.createFromTuple(Constants.CONTAINER_ENTITY_NAME, "container1");

@Test
public void testMapDataFlowContainer() throws URISyntaxException {
com.linkedin.container.Container input = new com.linkedin.container.Container();
input.setContainer(TEST_CONTAINER_URN);

final Map<String, EnvelopedAspect> containerAspect = new HashMap<>();
containerAspect.put(
Constants.CONTAINER_ASPECT_NAME,
new com.linkedin.entity.EnvelopedAspect().setValue(new Aspect(input.data())));
final EntityResponse response =
new EntityResponse()
.setEntityName(Constants.DATA_FLOW_ENTITY_NAME)
.setUrn(TEST_DATA_FLOW_URN)
.setAspects(new EnvelopedAspectMap(containerAspect));

final DataFlow actual = DataFlowMapper.map(null, response);

Assert.assertEquals(actual.getUrn(), TEST_DATA_FLOW_URN.toString());
Assert.assertEquals(actual.getContainer().getUrn(), TEST_CONTAINER_URN.toString());
}
}
@@ -0,0 +1,42 @@
package com.linkedin.datahub.graphql.types.datajob.mappers;

import com.linkedin.common.urn.Urn;
import com.linkedin.datahub.graphql.generated.DataJob;
import com.linkedin.entity.Aspect;
import com.linkedin.entity.EntityResponse;
import com.linkedin.entity.EnvelopedAspect;
import com.linkedin.entity.EnvelopedAspectMap;
import com.linkedin.metadata.Constants;
import java.net.URISyntaxException;
import java.util.HashMap;
import java.util.Map;
import org.testng.Assert;
import org.testng.annotations.Test;

public class DataJobMapperTest {
private static final Urn TEST_DATA_JOB_URN =
Urn.createFromTuple(Constants.DATA_JOB_ENTITY_NAME, "datajob1");
private static final Urn TEST_CONTAINER_URN =
Urn.createFromTuple(Constants.CONTAINER_ENTITY_NAME, "container1");

@Test
public void testMapDataJobContainer() throws URISyntaxException {
com.linkedin.container.Container input = new com.linkedin.container.Container();
input.setContainer(TEST_CONTAINER_URN);

final Map<String, EnvelopedAspect> containerAspect = new HashMap<>();
containerAspect.put(
Constants.CONTAINER_ASPECT_NAME,
new com.linkedin.entity.EnvelopedAspect().setValue(new Aspect(input.data())));
final EntityResponse response =
new EntityResponse()
.setEntityName(Constants.DATA_JOB_ENTITY_NAME)
.setUrn(TEST_DATA_JOB_URN)
.setAspects(new EnvelopedAspectMap(containerAspect));

final DataJob actual = DataJobMapper.map(null, response);

Assert.assertEquals(actual.getUrn(), TEST_DATA_JOB_URN.toString());
Assert.assertEquals(actual.getContainer().getUrn(), TEST_CONTAINER_URN.toString());
}
}
3 changes: 3 additions & 0 deletions datahub-web-react/src/graphql/dataFlow.graphql
@@ -50,6 +50,9 @@ fragment dataFlowFields on DataFlow {
dataPlatformInstance {
...dataPlatformInstanceFields
}
parentContainers {
...parentContainersFields
}
browsePathV2 {
...browsePathV2Fields
}
3 changes: 3 additions & 0 deletions datahub-web-react/src/graphql/fragments.graphql
@@ -403,6 +403,9 @@ fragment dataJobFields on DataJob {
dataPlatformInstance {
...dataPlatformInstanceFields
}
parentContainers {
...parentContainersFields
}
privileges {
canEditLineage
}
1 change: 1 addition & 0 deletions docs/how/updating-datahub.md
@@ -43,6 +43,7 @@ This file documents any backwards-incompatible changes in DataHub and assists pe
- OpenAPI Update: PIT Keep Alive parameter added to scroll. NOTE: This parameter requires the `pointInTimeCreationEnabled` feature flag to be enabled and the `elasticSearch.implementation` configuration to be `elasticsearch`. This feature is not supported for OpenSearch at this time and the parameter will not be respected without both of these set.
- OpenAPI Update 2: Previously there was an incorrectly marked parameter named `sort` on the generic list entities endpoint for v3. This parameter is deprecated and only supports a single string value while the documentation indicates it supports a list of strings. This documentation error has been fixed and the correct field, `sortCriteria`, is now documented which supports a list of strings.
- #12223: For dbt Cloud ingestion, the "View in dbt" link will point at the "Explore" page in the dbt Cloud UI. You can revert to the old behavior of linking to the dbt Cloud IDE by setting `external_url_mode: ide`.
- #12236: Data flow and data job entities may now additionally produce a `container` aspect, which requires a corresponding server upgrade; otherwise, the server may reject the aspect.

### Breaking Changes

2 changes: 2 additions & 0 deletions metadata-models/src/main/resources/entity-registry.yml
@@ -70,6 +70,7 @@ entities:
- glossaryTerms
- institutionalMemory
- dataPlatformInstance
- container
- browsePathsV2
- structuredProperties
- forms
@@ -93,6 +94,7 @@
- glossaryTerms
- institutionalMemory
- dataPlatformInstance
- container
- browsePathsV2
- structuredProperties
- incidentsSummary