diff --git a/sdk/vision/ai-vision-image-analysis-rest/README.md b/sdk/vision/ai-vision-image-analysis-rest/README.md index 5591c6a4eb2c..e67ba7c57c37 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/README.md +++ b/sdk/vision/ai-vision-image-analysis-rest/README.md @@ -3,12 +3,13 @@ The Image Analysis service provides AI algorithms for processing images and returning information about their content. In a single service call, you can extract one or more visual features from the image simultaneously, including getting a caption for the image, extracting text shown in the image (OCR) and detecting objects. For more information on the service and the supported visual features, see [Image Analysis overview][image_analysis_overview], and the [Concepts][image_analysis_concepts] page. Use the Image Analysis client library to: -* Authenticate against the service -* Set what features you would like to extract -* Upload an image for analysis, or send an image URL -* Get the analysis result -[Product documentation][image_analysis_overview] +- Authenticate against the service +- Set what features you would like to extract +- Upload an image for analysis, or send an image URL +- Get the analysis result + +[Product documentation][image_analysis_overview] | [Samples](https://aka.ms/azsdk/image-analysis/samples/js) | [Vision Studio][vision_studio] | [API reference documentation](https://aka.ms/azsdk/image-analysis/ref-docs/js) @@ -28,9 +29,9 @@ See our [support policy](https://github.com/Azure/azure-sdk-for-js/blob/main/SUP - An [Azure subscription](https://azure.microsoft.com/free). - A [Computer Vision resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision) in your Azure subscription. - * You will need the key and endpoint from this resource to authenticate against the service. - * You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production. 
- * Note that in order to run Image Analysis with the `Caption` or `Dense Captions` features, the Azure resource needs to be from one of the following GPU-supported regions: `East US`, `France Central`, `Korea Central`, `North Europe`, `Southeast Asia`, `West Europe`, or `West US`. + - You will need the key and endpoint from this resource to authenticate against the service. + - You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production. + - Note that to run Image Analysis with the `Caption` or `Dense Captions` features, the Azure resource must be in one of the following GPU-supported regions: `East US`, `France Central`, `Korea Central`, `North Europe`, `Southeast Asia`, `West Europe`, or `West US`. ### Install the `@azure-rest/ai-vision-image-analysis` package @@ -64,9 +65,9 @@ For more information about these features, see [Image Analysis overview][image_a Image Analysis works on images that meet the following requirements: -* The image must be presented in JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, or MPO format -* The file size of the image must be less than 20 megabytes (MB) -* The dimensions of the image must be greater than 50 x 50 pixels and less than 16,000 x 16,000 pixels +- The image must be presented in JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, or MPO format +- The file size of the image must be less than 20 megabytes (MB) +- The dimensions of the image must be greater than 50 x 50 pixels and less than 16,000 x 16,000 pixels ### ImageAnalysisClient @@ -78,26 +79,20 @@ The `ImageAnalysisClient` is the primary interface for developers interacting wi Here's an example of how to create an `ImageAnalysisClient` instance using a key-based authentication. 
- -```javascript Snippet:const endpoint = ""; -const key = ""; -const credential = new AzureKeyCredential(key); - -const client = new ImageAnalysisClient(endpoint, credential); - -const { ImageAnalysisClient } = require("@azure-rest/ai-vision-image-analysis"); -const { AzureKeyCredential } = require('@azure/core-auth'); +```ts snippet:ReadmeSampleCreateClient_KeyCredential +import { AzureKeyCredential } from "@azure/core-auth"; +import ImageAnalysisClient from "@azure-rest/ai-vision-image-analysis"; const endpoint = ""; const key = ""; const credential = new AzureKeyCredential(key); - -const client = new ImageAnalysisClient(endpoint, credential); +const client = ImageAnalysisClient(endpoint, credential); ``` #### Create ImageAnalysisClient with a Microsoft Entra ID Credential **Prerequisites for Entra ID Authentication**: + - The role `Cognitive Services User` assigned to you. Role assignment can be done via the "Access Control (IAM)" tab of your Computer Vision resource in the Azure portal. - [Azure CLI](https://learn.microsoft.com/cli/azure/install-azure-cli) installed. - You are logged into your Azure account by running `az login`. @@ -110,92 +105,199 @@ Client subscription key authentication is used in most of the examples in this g npm install @azure/identity ``` -```javascript Snippet:ImageAnalysisEntraIDAuth +```ts snippet:ReadmeSampleCreateClient_DefaultAzureCredential +import { DefaultAzureCredential } from "@azure/identity"; +import ImageAnalysisClient from "@azure-rest/ai-vision-image-analysis"; + const endpoint = ""; const credential = new DefaultAzureCredential(); - -const client = new ImageAnalysisClient(endpoint, credential); +const client = ImageAnalysisClient(endpoint, credential); ``` + ### Analyze an image from URL The following example demonstrates how to analyze an image using the Image Analysis client library for JavaScript. 
-```javascript Snippet:ImageAnalysisFromUrl +```ts snippet:ReadmeSampleAnalyzeImageFromUrl +import { DefaultAzureCredential } from "@azure/identity"; +import ImageAnalysisClient, { isUnexpected } from "@azure-rest/ai-vision-image-analysis"; + +const endpoint = ""; +const credential = new DefaultAzureCredential(); +const client = ImageAnalysisClient(endpoint, credential); + const imageUrl = "https://example.com/image.jpg"; const features = ["Caption", "DenseCaptions", "Objects", "People", "Read", "SmartCrops", "Tags"]; -async function analyzeImageFromUrl() { - const result = await client.path("/imageanalysis:analyze").post({ - body: { - url: imageUrl, - }, - queryParameters: { - features: features, - "smartCrops-aspect-ratios": [0.9, 1.33], - }, - contentType: "application/json", - }); +const result = await client.path("/imageanalysis:analyze").post({ + body: { + url: imageUrl, + }, + queryParameters: { + features: features, + "smartCrops-aspect-ratios": [0.9, 1.33], + }, + contentType: "application/json", +}); +if (isUnexpected(result)) { + throw result.body.error; +} + +console.log(`Model Version: ${result.body.modelVersion}`); +console.log(`Image Metadata: ${JSON.stringify(result.body.metadata)}`); + +if (result.body.captionResult) { + console.log( + `Caption: ${result.body.captionResult.text} (confidence: ${result.body.captionResult.confidence})`, + ); +} + +if (result.body.denseCaptionsResult) { + for (const denseCaption of result.body.denseCaptionsResult.values) { + console.log(`Dense Caption: ${JSON.stringify(denseCaption)}`); + } +} + +if (result.body.objectsResult) { + for (const object of result.body.objectsResult.values) { + console.log(`Object: ${JSON.stringify(object)}`); + } +} + +if (result.body.peopleResult) { + for (const person of result.body.peopleResult.values) { + console.log(`Person: ${JSON.stringify(person)}`); + } +} + +if (result.body.readResult) { + for (const block of result.body.readResult.blocks) { + console.log(`Text Block: 
${JSON.stringify(block)}`); + } +} - console.log("Image analysis result:", result.body); +if (result.body.smartCropsResult) { + for (const smartCrop of result.body.smartCropsResult.values) { + console.log(`Smart Crop: ${JSON.stringify(smartCrop)}`); + } } -analyzeImageFromUrl(); +if (result.body.tagsResult) { + for (const tag of result.body.tagsResult.values) { + console.log(`Tag: ${JSON.stringify(tag)}`); + } +} ``` ### Analyze an image from a local file In this example, we will analyze an image from a local file using the Image Analysis client library for JavaScript. -```javascript Snippet:ImageAnalysisFromLocalFile -const fs = require("fs"); +```ts snippet:ReadmeSampleAnalyzeImageFromFile +import { DefaultAzureCredential } from "@azure/identity"; +import ImageAnalysisClient, { isUnexpected } from "@azure-rest/ai-vision-image-analysis"; +import { readFileSync } from "node:fs"; + +const endpoint = ""; +const credential = new DefaultAzureCredential(); +const client = ImageAnalysisClient(endpoint, credential); const imagePath = "./path/to/your/image.jpg"; const features = ["Caption", "DenseCaptions", "Objects", "People", "Read", "SmartCrops", "Tags"]; -async function analyzeImageFromFile() { - const imageBuffer = fs.readFileSync(imagePath); +const imageBuffer = readFileSync(imagePath); + +const result = await client.path("/imageanalysis:analyze").post({ + body: imageBuffer, + queryParameters: { + features: features, + "smartCrops-aspect-ratios": [0.9, 1.33], + }, + contentType: "application/octet-stream", +}); +if (isUnexpected(result)) { + throw result.body.error; +} + +console.log(`Model Version: ${result.body.modelVersion}`); +console.log(`Image Metadata: ${JSON.stringify(result.body.metadata)}`); - const result = await client.path("/imageanalysis:analyze").post({ - body: imageBuffer, - queryParameters: { - features: features, - "smartCrops-aspect-ratios": [0.9, 1.33], - }, - contentType: "application/octet-stream", - }); +if (result.body.captionResult) { + 
console.log( + `Caption: ${result.body.captionResult.text} (confidence: ${result.body.captionResult.confidence})`, + ); +} + +if (result.body.denseCaptionsResult) { + for (const denseCaption of result.body.denseCaptionsResult.values) { + console.log(`Dense Caption: ${JSON.stringify(denseCaption)}`); + } +} - console.log("Image analysis result:", result.body); +if (result.body.objectsResult) { + for (const object of result.body.objectsResult.values) { + console.log(`Object: ${JSON.stringify(object)}`); + } +} + +if (result.body.peopleResult) { + for (const person of result.body.peopleResult.values) { + console.log(`Person: ${JSON.stringify(person)}`); + } } -analyzeImageFromFile(); +if (result.body.readResult) { + for (const block of result.body.readResult.blocks) { + console.log(`Text Block: ${JSON.stringify(block)}`); + } +} + +if (result.body.smartCropsResult) { + for (const smartCrop of result.body.smartCropsResult.values) { + console.log(`Smart Crop: ${JSON.stringify(smartCrop)}`); + } +} + +if (result.body.tagsResult) { + for (const tag of result.body.tagsResult.values) { + console.log(`Tag: ${JSON.stringify(tag)}`); + } +} ``` ### Extract text from an image Url + This example demonstrates how to extract printed or hand-written text for the image file [sample.jpg](https://aka.ms/azsdk/image-analysis/sample.jpg) using the ImageAnalysisClient. The method call returns an ImageAnalysisResult object. The ReadResult property on the returned object includes a list of text lines and a bounding polygon surrounding each text line. For each line, it also returns a list of words in the text line and a bounding polygon surrounding each word. 
-``` javascript Snippet:readmeText -const client: ImageAnalysisClient = createImageAnalysisClient(endpoint, credential); -const features: string[] = [ - 'Read' -]; +```ts snippet:ReadmeSampleExtractTextFromImageUrl +import { DefaultAzureCredential } from "@azure/identity"; +import ImageAnalysisClient, { isUnexpected } from "@azure-rest/ai-vision-image-analysis"; + +const endpoint = ""; +const credential = new DefaultAzureCredential(); +const client = ImageAnalysisClient(endpoint, credential); -const imageUrl: string = 'https://aka.ms/azsdk/image-analysis/sample.jpg'; +const features: string[] = ["Read"]; +const imageUrl: string = "https://aka.ms/azsdk/image-analysis/sample.jpg"; -client.path('/imageanalysis:analyze').post({ +const result = await client.path("/imageanalysis:analyze").post({ body: { url: imageUrl }, queryParameters: { features: features }, - contentType: 'application/json' -}).then(result => { - const iaResult: ImageAnalysisResultOutput = result.body as ImageAnalysisResultOutput; - - // Process the response - if (iaResult.readResult && iaResult.readResult.blocks.length > 0) { - iaResult.readResult.blocks.forEach(block => { - console.log(`Detected text block: ${JSON.stringify(block)}`); - }); - } else { - console.log('No text blocks detected.'); + contentType: "application/json", +}); +if (isUnexpected(result)) { + throw result.body.error; +} + +// Process the response +const imageAnalysisResult = result.body; +if (imageAnalysisResult.readResult && imageAnalysisResult.readResult.blocks.length > 0) { + for (const block of imageAnalysisResult.readResult.blocks) { + console.log(`Detected text block: ${JSON.stringify(block)}`); } +} else { + console.log("No text blocks detected."); +} ``` ## Troubleshooting @@ -204,8 +306,8 @@ client.path('/imageanalysis:analyze').post({ Enabling logging may help uncover useful information about failures. In order to see a log of HTTP requests and responses, set the `AZURE_LOG_LEVEL` environment variable to `info`. 
Alternatively, logging can be enabled at runtime by calling `setLogLevel` in the `@azure/logger`: -```javascript -const { setLogLevel } = require("@azure/logger"); +```ts snippet:SetLogLevel +import { setLogLevel } from "@azure/logger"; setLogLevel("info"); ``` @@ -228,4 +330,4 @@ If you'd like to contribute to this library, please read the [contributing guide [image_analysis_concepts]: https://learn.microsoft.com/azure/ai-services/computer-vision/concept-tag-images-40 [vision_studio]: https://aka.ms/vision-studio/image-analysis [azure_identity]: https://learn.microsoft.com/javascript/api/overview/azure/identity-readme -[azure_identity_dac]: https://learn.microsoft.com/javascript/api/@azure/identity/defaultazurecredential \ No newline at end of file +[azure_identity_dac]: https://learn.microsoft.com/javascript/api/@azure/identity/defaultazurecredential diff --git a/sdk/vision/ai-vision-image-analysis-rest/package.json b/sdk/vision/ai-vision-image-analysis-rest/package.json index 210f624fc759..0c58a19d7429 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/package.json +++ b/sdk/vision/ai-vision-image-analysis-rest/package.json @@ -54,14 +54,14 @@ "unit-test": "npm run unit-test:node && npm run unit-test:browser", "unit-test:browser": "npm run clean && dev-tool run build-package && dev-tool run build-test && dev-tool run test:vitest --browser", "unit-test:node": "dev-tool run test:vitest", - "update-snippets": "echo skipped" + "update-snippets": "dev-tool run update-snippets" }, "sideEffects": false, "autoPublish": false, "dependencies": { "@azure-rest/core-client": "^2.3.1", "@azure/core-auth": "^1.9.0", - "@azure/core-rest-pipeline": "^1.18.0", + "@azure/core-rest-pipeline": "^1.19.0", "@azure/logger": "^1.1.4", "tslib": "^2.8.1" }, @@ -71,16 +71,16 @@ "@azure-tools/test-utils-vitest": "^1.0.0", "@azure/dev-tool": "^1.0.0", "@azure/eslint-plugin-azure-sdk": "^3.0.0", - "@azure/identity": "^4.5.0", + "@azure/identity": "^4.7.0", "@types/node": "^18.0.0", - 
"@vitest/browser": "^3.0.3", - "@vitest/coverage-istanbul": "^3.0.3", + "@vitest/browser": "^3.0.6", + "@vitest/coverage-istanbul": "^3.0.6", "autorest": "latest", "dotenv": "^16.0.0", "eslint": "^9.9.0", - "playwright": "^1.49.0", + "playwright": "^1.50.1", "typescript": "~5.7.2", - "vitest": "^3.0.3" + "vitest": "^3.0.6" }, "//metadata": { "constantPaths": [ @@ -93,6 +93,7 @@ "browser": "./dist/browser/index.js", "type": "module", "tshy": { + "project": "./tsconfig.src.json", "exports": { "./package.json": "./package.json", ".": "./src/index.ts" @@ -105,8 +106,7 @@ "browser", "react-native" ], - "selfLink": false, - "project": "./tsconfig.src.json" + "selfLink": false }, "exports": { "./package.json": "./package.json", @@ -128,5 +128,6 @@ "default": "./dist/commonjs/index.js" } } - } + }, + "react-native": "./dist/react-native/index.js" } diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromLocalFile.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromLocalFile.ts index 1e2701915810..f8d7324a5dfb 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromLocalFile.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromLocalFile.ts @@ -1,7 +1,7 @@ // Copyright (c) Microsoft Corporation. // Licensed under the MIT License. 
-import * as fs from 'fs'; +import * as fs from 'node:fs'; import createImageAnalysisClient, { DenseCaptionOutput, ImageAnalysisClient, @@ -14,9 +14,7 @@ import createImageAnalysisClient, { } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromUrl.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromUrl.ts index 128707c9eb08..a01102f8fde4 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromUrl.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/analyzeImageFromUrl.ts @@ -13,9 +13,7 @@ import createImageAnalysisClient, { } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/caption.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/caption.ts index c6f69f81a53c..d676ac221c31 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/caption.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/caption.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv 
from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/denseCaptions.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/denseCaptions.ts index d4b29dc5707f..049d5800df2a 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/denseCaptions.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/denseCaptions.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/objects.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/objects.ts index 84d548367af1..6b2651ea7b7e 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/objects.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/objects.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/peopleResult.ts 
b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/peopleResult.ts index cc159f2efd08..e94eafd29d35 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/peopleResult.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/peopleResult.ts @@ -1,12 +1,10 @@ // Copyright (c) Microsoft Corporation. // Licensed under the MIT License. -import createImageAnalysisClient, { ImageAnalysisClient, DetectedPersonOutput, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; +import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/read.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/read.ts index a3346739187a..e641cc12c396 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/read.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/read.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/smartCropsResult.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/smartCropsResult.ts index 1db10c9860cf..90aeb84bd8da 100644 --- 
a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/smartCropsResult.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/smartCropsResult.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/tags.ts b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/tags.ts index 02a499f1a576..de64c76d64df 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/tags.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/samples/typescript/tags.ts @@ -4,9 +4,7 @@ import createImageAnalysisClient, { ImageAnalysisClient, isUnexpected } from '@azure-rest/ai-vision-image-analysis'; import { AzureKeyCredential } from '@azure/core-auth'; // Load the .env file if it exists -import * as dotenv from "dotenv"; -dotenv.config(); - +import "dotenv/config"; const endpoint: string = process.env['VISION_ENDPOINT'] || ''; const key: string = process.env['VISION_KEY'] || ''; const credential = new AzureKeyCredential(key); diff --git a/sdk/vision/ai-vision-image-analysis-rest/test/public/AnalysisTests.spec.ts b/sdk/vision/ai-vision-image-analysis-rest/test/public/AnalysisTests.spec.ts index 9de93286948b..6b54209b5470 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/test/public/AnalysisTests.spec.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/test/public/AnalysisTests.spec.ts @@ -54,7 +54,7 @@ describe("Analyze Tests", () => { return new Uint8Array(buffer); } - it("Analyze from URL", async function () { + it("Analyze from URL", async () => { const allFeatures: 
string[] = [ "Caption", "DenseCaptions", @@ -90,7 +90,7 @@ describe("Analyze Tests", () => { } }); - it("Analyze from Stream", async function () { + it("Analyze from Stream", async () => { const allFeatures: string[] = [ "Caption", "DenseCaptions", diff --git a/sdk/vision/ai-vision-image-analysis-rest/test/public/utils/env.ts b/sdk/vision/ai-vision-image-analysis-rest/test/public/utils/env.ts index 866412f4082d..54ee1e71af77 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/test/public/utils/env.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/test/public/utils/env.ts @@ -1,6 +1,4 @@ // Copyright (c) Microsoft Corporation. // Licensed under the MIT License. -import * as dotenv from "dotenv"; - -dotenv.config(); +import "dotenv/config"; diff --git a/sdk/vision/ai-vision-image-analysis-rest/test/snippets.spec.ts b/sdk/vision/ai-vision-image-analysis-rest/test/snippets.spec.ts new file mode 100644 index 000000000000..02c371d8f068 --- /dev/null +++ b/sdk/vision/ai-vision-image-analysis-rest/test/snippets.spec.ts @@ -0,0 +1,208 @@ +// Copyright (c) Microsoft Corporation. +// Licensed under the MIT License. 
+ +import { AzureKeyCredential } from "@azure/core-auth"; +import ImageAnalysisClient, { isUnexpected } from "../src/index.js"; +import { setLogLevel } from "@azure/logger"; +import { describe, it } from "vitest"; +import { DefaultAzureCredential } from "@azure/identity"; +import { readFileSync } from "node:fs"; + +describe("snippets", () => { + it("ReadmeSampleCreateClient_KeyCredential", async () => { + const endpoint = ""; + const key = ""; + const credential = new AzureKeyCredential(key); + const client = ImageAnalysisClient(endpoint, credential); + }); + + it("ReadmeSampleCreateClient_DefaultAzureCredential", async () => { + const endpoint = ""; + const credential = new DefaultAzureCredential(); + const client = ImageAnalysisClient(endpoint, credential); + }); + + it("ReadmeSampleAnalyzeImageFromUrl", async () => { + const endpoint = ""; + const credential = new DefaultAzureCredential(); + const client = ImageAnalysisClient(endpoint, credential); + // @ts-preserve-whitespace + const imageUrl = "https://example.com/image.jpg"; + const features = [ + "Caption", + "DenseCaptions", + "Objects", + "People", + "Read", + "SmartCrops", + "Tags", + ]; + // @ts-preserve-whitespace + const result = await client.path("/imageanalysis:analyze").post({ + body: { + url: imageUrl, + }, + queryParameters: { + features: features, + "smartCrops-aspect-ratios": [0.9, 1.33], + }, + contentType: "application/json", + }); + if (isUnexpected(result)) { + throw result.body.error; + } + // @ts-preserve-whitespace + console.log(`Model Version: ${result.body.modelVersion}`); + console.log(`Image Metadata: ${JSON.stringify(result.body.metadata)}`); + // @ts-preserve-whitespace + if (result.body.captionResult) { + console.log( + `Caption: ${result.body.captionResult.text} (confidence: ${result.body.captionResult.confidence})`, + ); + } + // @ts-preserve-whitespace + if (result.body.denseCaptionsResult) { + for (const denseCaption of result.body.denseCaptionsResult.values) { + 
console.log(`Dense Caption: ${JSON.stringify(denseCaption)}`); + } + } + // @ts-preserve-whitespace + if (result.body.objectsResult) { + for (const object of result.body.objectsResult.values) { + console.log(`Object: ${JSON.stringify(object)}`); + } + } + // @ts-preserve-whitespace + if (result.body.peopleResult) { + for (const person of result.body.peopleResult.values) { + console.log(`Person: ${JSON.stringify(person)}`); + } + } + // @ts-preserve-whitespace + if (result.body.readResult) { + for (const block of result.body.readResult.blocks) { + console.log(`Text Block: ${JSON.stringify(block)}`); + } + } + // @ts-preserve-whitespace + if (result.body.smartCropsResult) { + for (const smartCrop of result.body.smartCropsResult.values) { + console.log(`Smart Crop: ${JSON.stringify(smartCrop)}`); + } + } + // @ts-preserve-whitespace + if (result.body.tagsResult) { + for (const tag of result.body.tagsResult.values) { + console.log(`Tag: ${JSON.stringify(tag)}`); + } + } + }); + + it("ReadmeSampleAnalyzeImageFromFile", async () => { + const endpoint = ""; + const credential = new DefaultAzureCredential(); + const client = ImageAnalysisClient(endpoint, credential); + // @ts-preserve-whitespace + const imagePath = "./path/to/your/image.jpg"; + const features = [ + "Caption", + "DenseCaptions", + "Objects", + "People", + "Read", + "SmartCrops", + "Tags", + ]; + // @ts-preserve-whitespace + const imageBuffer = readFileSync(imagePath); + // @ts-preserve-whitespace + const result = await client.path("/imageanalysis:analyze").post({ + body: imageBuffer, + queryParameters: { + features: features, + "smartCrops-aspect-ratios": [0.9, 1.33], + }, + contentType: "application/octet-stream", + }); + if (isUnexpected(result)) { + throw result.body.error; + } + // @ts-preserve-whitespace + console.log(`Model Version: ${result.body.modelVersion}`); + console.log(`Image Metadata: ${JSON.stringify(result.body.metadata)}`); + // @ts-preserve-whitespace + if (result.body.captionResult) { + 
console.log( + `Caption: ${result.body.captionResult.text} (confidence: ${result.body.captionResult.confidence})`, + ); + } + // @ts-preserve-whitespace + if (result.body.denseCaptionsResult) { + for (const denseCaption of result.body.denseCaptionsResult.values) { + console.log(`Dense Caption: ${JSON.stringify(denseCaption)}`); + } + } + // @ts-preserve-whitespace + if (result.body.objectsResult) { + for (const object of result.body.objectsResult.values) { + console.log(`Object: ${JSON.stringify(object)}`); + } + } + // @ts-preserve-whitespace + if (result.body.peopleResult) { + for (const person of result.body.peopleResult.values) { + console.log(`Person: ${JSON.stringify(person)}`); + } + } + // @ts-preserve-whitespace + if (result.body.readResult) { + for (const block of result.body.readResult.blocks) { + console.log(`Text Block: ${JSON.stringify(block)}`); + } + } + // @ts-preserve-whitespace + if (result.body.smartCropsResult) { + for (const smartCrop of result.body.smartCropsResult.values) { + console.log(`Smart Crop: ${JSON.stringify(smartCrop)}`); + } + } + // @ts-preserve-whitespace + if (result.body.tagsResult) { + for (const tag of result.body.tagsResult.values) { + console.log(`Tag: ${JSON.stringify(tag)}`); + } + } + }); + + it("ReadmeSampleExtractTextFromImageUrl", async () => { + const endpoint = ""; + const credential = new DefaultAzureCredential(); + const client = ImageAnalysisClient(endpoint, credential); + // @ts-preserve-whitespace + const features: string[] = ["Read"]; + const imageUrl: string = "https://aka.ms/azsdk/image-analysis/sample.jpg"; + // @ts-preserve-whitespace + const result = await client.path("/imageanalysis:analyze").post({ + body: { url: imageUrl }, + queryParameters: { features: features }, + contentType: "application/json", + }); + if (isUnexpected(result)) { + throw result.body.error; + } + // @ts-preserve-whitespace + // Process the response + const imageAnalysisResult = result.body; + if (imageAnalysisResult.readResult && 
imageAnalysisResult.readResult.blocks.length > 0) { + for (const block of imageAnalysisResult.readResult.blocks) { + console.log(`Detected text block: ${JSON.stringify(block)}`); + } + } else { + console.log("No text blocks detected."); + } + }); + + it("SetLogLevel", async () => { + setLogLevel("info"); + }); +}); diff --git a/sdk/vision/ai-vision-image-analysis-rest/tsconfig.json b/sdk/vision/ai-vision-image-analysis-rest/tsconfig.json index 273d9078a24a..19ceb382b521 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/tsconfig.json +++ b/sdk/vision/ai-vision-image-analysis-rest/tsconfig.json @@ -1,7 +1,13 @@ { "references": [ - { "path": "./tsconfig.src.json" }, - { "path": "./tsconfig.samples.json" }, - { "path": "./tsconfig.test.json" } + { + "path": "./tsconfig.src.json" + }, + { + "path": "./tsconfig.samples.json" + }, + { + "path": "./tsconfig.test.json" + } ] } diff --git a/sdk/vision/ai-vision-image-analysis-rest/vitest.browser.config.ts b/sdk/vision/ai-vision-image-analysis-rest/vitest.browser.config.ts index 50ec2d5489b0..edf54e73547e 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/vitest.browser.config.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/vitest.browser.config.ts @@ -9,8 +9,9 @@ export default mergeConfig( defineConfig({ test: { include: ["dist-test/browser/test/**/*.spec.js"], - hookTimeout: 5000000, - testTimeout: 5000000, + exclude: ["dist-test/browser/test/snippets.spec.js"], + testTimeout: 1200000, + hookTimeout: 1200000, }, }), ); diff --git a/sdk/vision/ai-vision-image-analysis-rest/vitest.config.ts b/sdk/vision/ai-vision-image-analysis-rest/vitest.config.ts index 8849d7428190..86a71911ccc2 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/vitest.config.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/vitest.config.ts @@ -8,8 +8,8 @@ export default mergeConfig( viteConfig, defineConfig({ test: { - hookTimeout: 5000000, - testTimeout: 5000000, + testTimeout: 1200000, + hookTimeout: 1200000, }, }), ); diff --git 
a/sdk/vision/ai-vision-image-analysis-rest/vitest.esm.config.ts b/sdk/vision/ai-vision-image-analysis-rest/vitest.esm.config.ts index 2f6e757a54f7..5e9735e9b144 100644 --- a/sdk/vision/ai-vision-image-analysis-rest/vitest.esm.config.ts +++ b/sdk/vision/ai-vision-image-analysis-rest/vitest.esm.config.ts @@ -5,7 +5,4 @@ import { mergeConfig } from "vitest/config"; import vitestConfig from "./vitest.config.ts"; import vitestEsmConfig from "../../../vitest.esm.shared.config.ts"; -export default mergeConfig( - vitestConfig, - vitestEsmConfig -); +export default mergeConfig(vitestConfig, vitestEsmConfig);
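Every rewritten README snippet in this diff routes errors through the SDK's `isUnexpected` helper before touching `result.body`. As background on that pattern, here is a minimal, self-contained sketch of how such a type guard narrows a REST response union. The types and `describeCaption` helper below are stubs invented for illustration, not the SDK's generated ones:

```typescript
// Stand-in response shapes (illustrative only; the real SDK generates
// richer types from the service's OpenAPI description).
interface AnalyzeError {
  status: string;
  body: { error: { code: string; message: string } };
}

interface AnalyzeSuccess {
  status: "200";
  body: {
    modelVersion: string;
    captionResult?: { text: string; confidence: number };
  };
}

type AnalyzeResponse = AnalyzeSuccess | AnalyzeError;

// Mirrors the shape of the SDK's isUnexpected helper: a user-defined
// type guard that narrows the union based on the HTTP status code.
function isUnexpected(response: AnalyzeResponse): response is AnalyzeError {
  return response.status !== "200";
}

// Hypothetical consumer showing the narrowing in action.
function describeCaption(response: AnalyzeResponse): string {
  if (isUnexpected(response)) {
    // After narrowing, `body.error` is available and `captionResult` is not.
    throw new Error(`${response.body.error.code}: ${response.body.error.message}`);
  }
  return response.body.captionResult
    ? `Caption: ${response.body.captionResult.text}`
    : "No caption";
}

const ok: AnalyzeResponse = {
  status: "200",
  body: {
    modelVersion: "2023-10-01",
    captionResult: { text: "a cat on a sofa", confidence: 0.97 },
  },
};
console.log(describeCaption(ok)); // prints "Caption: a cat on a sofa"
```

In the real client, `isUnexpected` is generated alongside the response types, which is what makes the bare `throw result.body.error;` in the updated snippets type-safe.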