diff --git a/src/connections/auto-instrumentation/kotlin-setup.md b/src/connections/auto-instrumentation/kotlin-setup.md index b7fa63241a..541cb52a7b 100644 --- a/src/connections/auto-instrumentation/kotlin-setup.md +++ b/src/connections/auto-instrumentation/kotlin-setup.md @@ -17,167 +17,180 @@ This guide shows how to install and configure the library, as well as how to ena To use Signals with Android, you need: - An active Segment workspace with Auto-Instrumentation enabled. -- A Kotlin-based Android project. - Android Gradle Plugin version 7.0 or later. - A minimum compile SDK version of 21. Signals supports [Jetpack Compose](https://developer.android.com/compose){:target="_blank"} and traditional Android UI frameworks. It also includes optional plugins for network tracking using [OkHttp3](https://square.github.io/okhttp/){:target="_blank"}, [Retrofit](https://square.github.io/retrofit/){:target="_blank"}, or [HttpURLConnection](https://developer.android.com/reference/java/net/HttpURLConnection){:target="_blank"}. -Segment recommends testing in a development environment before deploying Signals in production. For more information, see [Debug mode](#step-4-enable-debug-mode). +Segment recommends testing in a development environment before deploying Signals in production. For more information, see [Debug mode](#step-3-enable-debug-mode). -## Step 1: Install dependencies +## Prerequisites -To install Signals, add the following dependencies to your app-level Gradle build file. +Auto-Instrumentation (also known as Signals) works on top of Analytics and Live Plugins. Make sure to add the following dependencies to your module's Gradle file if you don't have them already. 
```groovy -dependencies { - // Core Analytics Kotlin library - implementation("com.segment.analytics.kotlin:android:1.19.1") - - // Live plugin for real-time analytics - implementation("com.segment.analytics.kotlin:analytics-kotlin-live:1.1.0") - - // Signals core library - implementation("com.segment.analytics.kotlin.signals:core:0.5.0") - - // Optional: Jetpack Compose UI tracking - implementation("com.segment.analytics.kotlin.signals:compose:0.5.0") - - // Optional: OkHttp3 network request tracking - implementation("com.segment.analytics.kotlin.signals:okhttp3:0.5.0") - - // Optional: Screen and route tracking for Navigation components - implementation("com.segment.analytics.kotlin.signals:navigation:0.5.0") - - // Optional: HttpURLConnection tracking - implementation("com.segment.analytics.kotlin.signals:java-net:0.5.0") -} +// analytics kotlin +implementation ("com.segment.analytics.kotlin:android:1.22.0") +// live plugin +implementation("com.segment.analytics.kotlin:analytics-kotlin-live:1.3.0") ``` -The core libraries are required to enable Signals and real-time analytics. Use the following optional plugins to track additional activity based on your app's architecture: - -- **Compose**: Tracks user interface events in Jetpack Compose. -- **OkHttp3**: Captures requests sent through OkHttp3 or Retrofit. -- **Navigation**: Tracks route changes when using Jetpack Navigation. -- **JavaNet**: Tracks network activity sent through `HttpURLConnection`. - -Only add the plugins you plan to use. You can add or remove them later without reinitializing your source. - -## Step 2: Initialize the SDK - -After you add dependencies, you need to initialize the Analytics client and configure the Signals plugin. - -Start by creating the `Analytics` instance using your source's write key. Then add the Signals plugin and configure its settings separately. 
-
-```kotlin
-// Create the Analytics instance with your configuration
-val analytics = Analytics(Configuration(writeKey = ""))
-
-// Add the live plugin for real-time event handling
-analytics.add(LivePlugins())
+## Step 1: Getting started

-// Add the Signals plugin
-analytics.add(Signals)
-
-// Configure Signals settings
-Signals.configuration = Configuration(
-    maximumBufferSize = 1000, // Number of signals to keep in memory
-    broadcastInterval = 60, // Send signals every 60 seconds
-    broadcasters = listOf(WebhookBroadcaster("YOUR_WEBHOOK")), // Optional
-    debugMode = true // For development use only
-)
-
-// Optional: Add the Compose plugin to track UI events and interactions
-analytics.add(SignalsComposeTrackingPlugin())
+To get started:
+1. Add Signals Core:
+   ```groovy
+   // signal core
+   implementation ("com.segment.analytics.kotlin.signals:core:1.0.0")
+   ```
+2. Initialize Signals. For a complete list, see [configuration options](#configuration-options).
+   ```kotlin
+   // ... existing Analytics setup
+   analytics.add(LivePlugins()) // Make sure LivePlugins is added
+   analytics.add(Signals) // Add the signals plugin
+
+   Signals.configuration = Configuration(
+      // sendDebugSignalsToSegment relays events to the Segment server. Only set it to true for development purposes.
+      sendDebugSignalsToSegment = true,
+      // obfuscateDebugSignals obfuscates sensitive data
+      obfuscateDebugSignals = true,
+      // ... other options
+   )
+   ```
+3. Add the dependencies and plugins you need to:
+   * [Capture interactions](#capture-interactions).
+   * [Capture navigation](#capture-navigation).
+   * [Capture network](#capture-network).

-// Optional: Track screen transitions using Navigation
-analytics.add(SignalsActivityTrackingPlugin())
-navController.turnOnScreenTracking()
-```
+## Step 2: Additional setup

-When you run this code, keep the following in mind:
+### Capture interactions

-- You need to replace `` with the key from your Android Source in Segment.
-- `debugMode` sends signals to Segment for use in the Event Builder. Only enable it in development environments. -- If your app doesn't use Jetpack Compose or Navigation, you can skip those plugin lines. +#### Kotlin Compose -For more options, see [Configuration options reference](#configuration-options). +1. Add the dependency to your module’s Gradle build file: + ```groovy + implementation ("com.segment.analytics.kotlin.signals:compose:1.0.0") + ``` -## Step 3: Track network requests +2. Add `SignalsComposeTrackingPlugin` to analytics: + ```kotlin + analytics.add(SignalsComposeTrackingPlugin()) + ``` -Signals supports automatic tracking of network activity for apps that use OkHttp3, Retrofit, or `HttpURLConnection`. +#### Legacy XML UI -Add the relevant plugin based on your network stack. +1. Add the uitoolkit Gradle Plugin dependency to project-level `build.gradle`: + ```groovy + buildscript { + dependencies { + classpath 'com.segment.analytics.kotlin.signals:uitoolkit-gradle-plugin:1.0.0' + } + } + ``` +2. Apply the plugin in your app-level `build.gradle` and add the dependency: + ```groovy + plugins { + // ...other plugins + id 'com.segment.analytics.kotlin.signals.uitoolkit-tracking' + } + + dependencies { + // ..other dependencies + implementation ("com.segment.analytics.kotlin.signals:uitoolkit:1.0.0") + } + ``` -### OkHttp3 -1. Add the dependency to your Gradle file: +### Capture navigation +1. Add the navigation Gradle Plugin dependency to project-level `build.gradle`: ```groovy - implementation("com.segment.analytics.kotlin.signals:okhttp3:0.5.0") + buildscript { + dependencies { + classpath 'com.segment.analytics.kotlin.signals:navigation-gradle-plugin:1.0.0' + } + } ``` - -2. Add the tracking plugin to your `OkHttpClient`: - +2. 
Apply the plugin in your app-level `build.gradle` and add the dependency: + ```groovy + plugins { + // ...other plugins + id 'com.segment.analytics.kotlin.signals.navigation-tracking' + } + + dependencies { + // ..other dependencies + implementation ("com.segment.analytics.kotlin.signals:navigation:1.0.0") + } + ``` +3. (**Optional**): Add `SignalsActivityTrackingPlugin` to analytics to track Activity/Fragment navigation. **This is not required for Compose Navigation**. ```kotlin - val okHttpClient = OkHttpClient.Builder() - .addInterceptor(SignalsOkHttp3TrackingPlugin()) - .build() + analytics.add(SignalsActivityTrackingPlugin()) ``` -### Retrofit - -Retrofit is built on top of OkHttp, so the setup is similar. - -1. Add the same OkHttp3 plugin shown in the previous sectiion: +### Capture network +#### OkHttp + +1. Add the dependency: ```groovy - implementation("com.segment.analytics.kotlin.signals:okhttp3:0.5.0") + implementation ("com.segment.analytics.kotlin.signals:okhttp3:1.0.0") ``` -2. Attach the plugin through your Retrofit client configuration: - +2. Add `SignalsOkHttp3TrackingPlugin` as an interceptor to your OkHttpClient: ```kotlin - val okHttpClient = OkHttpClient.Builder() - .addInterceptor(SignalsOkHttp3TrackingPlugin()) - .build() - - val retrofit = Retrofit.Builder() - .client(okHttpClient) - .baseUrl("https://your.api.endpoint") - .build() + private val okHttpClient = OkHttpClient.Builder() + .addInterceptor(SignalsOkHttp3TrackingPlugin()) + .build() ``` -### HttpURLConnection - -1. Add the JavaNet plugin dependency: +#### Retrofit +1. Add the dependency: ```groovy - implementation("com.segment.analytics.kotlin.signals:java-net:0.5.0") + implementation ("com.segment.analytics.kotlin.signals:okhttp3:1.0.0") ``` -2. Install the plugin at runtime: - +2. 
Add `SignalsOkHttp3TrackingPlugin` as an interceptor to your Retrofit client:
   ```kotlin
-   JavaNetTrackingPlugin.install()
+   private val okHttpClient = OkHttpClient.Builder()
+      .addInterceptor(SignalsOkHttp3TrackingPlugin())
+      .build()
+
+   val retrofit = Retrofit.Builder()
+      .client(okHttpClient)
+      .build()
   ```

-Depending on your app’s network stack, you may only need one plugin. If your app uses multiple clients, you can install more than one plugin.
+#### java.net.HttpURLConnection
+ 1. Add the dependency:
+    ```groovy
+    implementation ("com.segment.analytics.kotlin.signals:java-net:1.0.0")
+    ```
+
+ 2. Install the `JavaNetTrackingPlugin` where you initialize analytics:
+    ```kotlin
+    JavaNetTrackingPlugin.install()
+    ```
+

-## Step 4: Enable debug mode
+## Step 3: Enable debug mode

By default, Signals stores captured data on the device and doesn't forward it to Segment. This process prevents unnecessary bandwidth use and helps support privacy compliance requirements.

-To view captured signals in the Event Builder and create event generation rules, you need to enable `debugMode`. This setting temporarily lets the SDK send signal data to Segment while you're testing.
+To view captured signals in the Event Builder and create event generation rules, enable `sendDebugSignalsToSegment`. This setting temporarily lets the SDK send signal data to Segment while you're testing.
+
+In addition, the SDK obfuscates signals sent to Segment by default. To view the complete data, you need to turn off `obfuscateDebugSignals`.

> warning ""
-> Only enable `debugMode` in development environments. Avoid using `debugMode` in production apps.
+> Only enable `sendDebugSignalsToSegment` in development environments. Avoid using `sendDebugSignalsToSegment` in production apps.

-You can enable `debugMode` in one of two ways.
+You can enable `sendDebugSignalsToSegment` and turn off `obfuscateDebugSignals` in one of two ways.
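Whichever option you choose, the two flags are meant to move in opposite directions: development builds send debug signals and skip obfuscation, while production builds do the reverse. As a sketch (the `debugSignalFlags` helper below is hypothetical, not part of the Signals SDK), you can derive both values from a single "is this a development build?" input so they never drift apart:

```kotlin
// Hypothetical helper, not part of the Signals SDK: derive both debug-related
// flag values from a single "is this a development build?" input.
fun debugSignalFlags(isDevBuild: Boolean): Pair<Boolean, Boolean> {
    // first = sendDebugSignalsToSegment: relay signals to Segment only in dev builds
    // second = obfuscateDebugSignals: show unobfuscated data only in dev builds
    return Pair(first = isDevBuild, second = !isDevBuild)
}

// Example: a development build sends debug signals and disables obfuscation.
val (sendDebug, obfuscate) = debugSignalFlags(isDevBuild = true)
```

Both options below are ways of feeding that single dev/prod decision into `Signals.configuration`.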
### Option 1: Use build flavors

-Configure `debugMode` at build time using [Android product flavors](https://developer.android.com/build/build-variants#product-flavors){:target="_blank"}.
+Configure `sendDebugSignalsToSegment` and `obfuscateDebugSignals` at build time using [Android product flavors](https://developer.android.com/build/build-variants#product-flavors){:target="_blank"}.

1. In your `build.gradle` file, define two flavors:

@@ -186,10 +199,12 @@ Configure `debugMode` at build time using [Android product flavors](https://deve
    ...
    productFlavors {
        prod {
-           buildConfigField "boolean", "DEBUG_MODE", "false"
+           buildConfigField "boolean", "SEND_DEBUG_SIGNALS_TO_SEGMENT", "false"
+           buildConfigField "boolean", "OBFUSCATE_DEBUG_SIGNALS", "true"
        }
        dev {
-           buildConfigField "boolean", "DEBUG_MODE", "true"
+           buildConfigField "boolean", "SEND_DEBUG_SIGNALS_TO_SEGMENT", "true"
+           buildConfigField "boolean", "OBFUSCATE_DEBUG_SIGNALS", "false"
        }
    }
}
@@ -199,40 +214,50 @@ Configure `debugMode` at build time using [Android product flavors](https://deve

   ```kotlin
   Signals.configuration = Configuration(
-       ...
-       debugMode = BuildConfig.DEBUG_MODE
+       // ... other config options
+       sendDebugSignalsToSegment = BuildConfig.SEND_DEBUG_SIGNALS_TO_SEGMENT,
+       obfuscateDebugSignals = BuildConfig.OBFUSCATE_DEBUG_SIGNALS
   )
   ```

### Option 2: Use a feature flag

-If your app uses [Firebase Remote Config](https://firebase.google.com/docs/remote-config){:target="_blank"} or a similar system, you can control `debugMode` remotely.
+If your app uses [Firebase Remote Config](https://firebase.google.com/docs/remote-config){:target="_blank"} or a similar system, you can control `sendDebugSignalsToSegment` and `obfuscateDebugSignals` remotely.

```kotlin
Signals.configuration = Configuration(
    ...
-    debugMode = remoteConfig.getBoolean("debug_mode")
+    sendDebugSignalsToSegment = remoteConfig.getBoolean("sendDebugSignalsToSegment"),
+    obfuscateDebugSignals = remoteConfig.getBoolean("obfuscateDebugSignals")
)
```

+## Step 4: Turn on Auto-Instrumentation in your source
+
+Next, return to the source settings to turn on Auto-Instrumentation:
+
+1. Go to **Connections > Sources**.
+2. Select the source you used in [Step 1](#step-1-getting-started).
+3. From the source's overview tab, go to **Settings > Advanced**.
+4. Toggle Auto-Instrumentation on.
+
## Step 5: Verify event collection

After you build and run your app, use the [Event Builder](/docs/connections/auto-instrumentation/event-builder/) to confirm that Signals are being collected correctly.

1. In your Segment workspace, go to **Connections > Sources** and select the Android Source you configured.
2. Open the **Event Builder** tab.
-3. Interact with your app on a simulator or test device:
-   - Navigate between screens.
-   - Tap buttons and UI elements.
-   - Trigger network requests.
-
-If `debugMode` is enabled, Signals appear in real time as you interact with the app.
-
+3. Interact with your app on a simulator or test device:
+   > - Navigate between screens.
+   > - Tap buttons and UI elements.
+   > - Trigger network requests.
+   >
+   > If `sendDebugSignalsToSegment` is enabled, Signals appear in real time as you interact with the app.
4. In the Event Builder, select a signal and click **Configure event** to define a new event.
5. After you add any event mappings, click **Publish event rules** to save them.

> info "What if I don't see the Event Builder tab?"
-> If you don't see the Event Builder tab, confirm that the SDK is installed correctly and make sure `debugMode` is enabled. Verify that Auto-Instrumentation is enabled in **Settings > Advanced**. If you still don't see it, reach out to your CSM.
+> If you don't see the Event Builder tab, confirm that the SDK is installed correctly and make sure `sendDebugSignalsToSegment` is enabled. Verify that Auto-Instrumentation is enabled in **Settings > Advanced**. If you still don't see it, reach out to your CSM. ## Configuration options @@ -240,12 +265,14 @@ Use the `Signals.configuration` object to control how captured signals are store The following table lists the available options: -| Option | Required | Type | Default | Description | -| ------------------- | -------- | ------------------------- | ------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `maximumBufferSize` | No | `Int` | `1000` | The number of captured signals to keep in memory before relaying them. Signals get stored in a first-in, first-out buffer. | -| `broadcastInterval` | No | `Int` (seconds) | `60` | The interval, in seconds, at which buffered signals are sent to broadcasters. | -| `broadcasters` | No | `List` | N/A | A list of broadcasters that forward signal data to external destinations. `SegmentBroadcaster` is included by default, and you can add others like `WebhookBroadcaster` or a custom implementation. | -| `debugMode` | No | `Boolean` | `false` | When `true`, relays signals to Segment so they appear in the Event Builder. Only enable this in development environments. | +| OPTION | REQUIRED | VALUE | DESCRIPTION | +|------------------|----------|---------------------------|-------------| +| **maximumBufferSize** | No | Integer | The number of signals to be kept for JavaScript inspection. This buffer is first-in, first-out. Default is **1000**. | +| **relayCount** | No | Integer | Relays every X signals to Segment. Default is **20**. | +| **relayInterval** | No | Integer | Relays signals to Segment every X seconds. Default is **60**. 
|
+| **broadcasters** | No | List\<SignalBroadcaster\> | An array of broadcasters. These objects forward signal data to their destinations, like **WebhookBroadcaster**, or you could write your own **DebugBroadcaster** that writes logs to the developer console. **SegmentBroadcaster** is always added by the SDK. |
+| **sendDebugSignalsToSegment** | No | Boolean | Turns on debug mode and allows the SDK to relay Signals to Segment server. Default is **false**. It should only be set to true for development purposes. |
+| **obfuscateDebugSignals** | No | Boolean | Obfuscates signals being relayed to Segment. Default is **true**. |

## Next steps

diff --git a/src/connections/auto-instrumentation/swift-setup.md b/src/connections/auto-instrumentation/swift-setup.md
index 7133f052d8..953efab2f6 100644
--- a/src/connections/auto-instrumentation/swift-setup.md
+++ b/src/connections/auto-instrumentation/swift-setup.md
@@ -12,7 +12,7 @@ Learn how to connect an existing source, integrate dependencies, turn on Auto-In
> info "Regional availability"
> Auto-Instrumentation isn't supported in EU workspaces.

-## Step 1: Get your source write key
+## Before you start

You need the `writeKey` from an existing Segment source. To find it:

@@ -21,69 +21,270 @@ You need the `writeKey` from an existing Segment source. To find it:
3. From the source's overview tab, go to **Settings > API Keys**.
4. Copy the `writeKey` shown in the code block.

-## Step 2: Add dependencies and initialization code
+Segment recommends testing in a development environment before deploying Signals in production. For more information, see [Debug mode](#step-3-enable-debug-mode).

-Next, add the Signals SDKs to your Swift applicatiion.
+## Prerequisites

-1. Use Swift Package Manager to add the Signals SDK from the following repository:
+Auto-Instrumentation (also known as Signals) works on top of Analytics. Make sure to add the following dependency to your project if you don't have analytics-swift already.
-
-   ```zsh
-   https://github.com/segment-integrations/analytics-swift-live.git
+```swift
+dependencies: [
+    .package(url: "https://github.com/segmentio/analytics-swift.git", from: "1.9.1")
+]
+```
+
+## Step 1: Getting started
+
+1. Add AnalyticsLive to your Swift Package dependencies:
+   ```swift
+   dependencies: [
+       .package(url: "https://github.com/segmentio/analytics-live-swift.git", from: "3.2.1")
+   ]
   ```

-2. Add the initialization code and configuration options:
+2. Import and initialize with your Analytics instance. For a complete list, see [configuration options](#configuration-options).
+   ```swift
+   import Segment
+   import AnalyticsLive
+
+   let analytics = Analytics(configuration: Configuration(writeKey: "YOUR_WRITE_KEY"))
+
+   // Add LivePlugins first
+   analytics.add(plugin: LivePlugins())
+
+   // Add Signals
+   analytics.add(plugin: Signals.shared)
+
+   // Configure Signals
+   // NOTE: See the debug mode section below on using these flags appropriately.
+   #if DEBUG
+   let sendDebugSignalsToSegment = true  // Only true for development
+   let obfuscateDebugSignals = false     // Only false for development
+   #else
+   let sendDebugSignalsToSegment = false
+   let obfuscateDebugSignals = true
+   #endif
+
+   Signals.shared.useConfiguration(SignalsConfiguration(
+       writeKey: "YOUR_WRITE_KEY", // Same writeKey as Analytics
+       useUIKitAutoSignal: true,
+       useSwiftUIAutoSignal: true,
+       useNetworkAutoSignal: true,
+       sendDebugSignalsToSegment: sendDebugSignalsToSegment,
+       obfuscateDebugSignals: obfuscateDebugSignals
+       // ... other options
+   ))
+   ```
+3. Set up capture for the UI framework(s) you're using:
+   * [Capture SwiftUI interactions](#swiftui).
+   * [Capture UIKit interactions](#uikit).
+   * [Capture network activity](#capture-network).
+
+
+## Step 2: Additional setup

-> success ""
-> See [configuration options](#configuration-options) for a complete list.
+### Capture interactions
+#### SwiftUI
+
+SwiftUI automatic signal capture requires adding typealiases to your code. This is necessary because SwiftUI doesn't provide hooks for automatic instrumentation.
+
+1. Enable SwiftUI auto-signals in your configuration:
   ```swift
-    // Configure Analytics with your settings
-    {... 
....}
+   Signals.shared.useConfiguration(SignalsConfiguration(
+       writeKey: "YOUR_WRITE_KEY",
+       useSwiftUIAutoSignal: true
+       // ... other options
+   ))
+   ```

-    // Set up the Signals SDK configuration
-    let config = Signals.Configuration(
-        writeKey: "", // Replace with the write key you previously copied
-        maximumBufferSize: 100,
-        useSwiftUIAutoSignal: true,
-        useNetworkAutoSignal: true
-    )
+2. Add the following typealiases to your SwiftUI views or in a shared file:
+   ```swift
+   import SwiftUI
+   import AnalyticsLive
+
+   // Navigation
+   typealias NavigationLink = SignalNavigationLink
+   typealias NavigationStack = SignalNavigationStack // iOS 16+
+
+   // Selection & Input Controls
+   typealias Button = SignalButton
+   typealias TextField = SignalTextField
+   typealias SecureField = SignalSecureField
+   typealias Picker = SignalPicker
+   typealias Toggle = SignalToggle
+   typealias Slider = SignalSlider // Not available on tvOS
+   typealias Stepper = SignalStepper // Not available on tvOS
+
+   // List & Collection Views
+   typealias List = SignalList
+   ```

-    // Locate and set the fallback JavaScript file for edge functions
-    let fallbackURL = Bundle.main.url(forResource: "MyEdgeFunctions", withExtension: "js")
+3. Use the controls in your SwiftUI code:
+   ```swift
+   struct ContentView: View {
+       @State private var text = ""
+
+       var body: some View {
+           NavigationStack {
+               VStack {
+                   Button("Click Me") {
+                       // Button tap automatically generates a signal
+                   }
+
+                   TextField("Enter text", text: $text)
+                   // Text changes automatically generate signals
+               }
+           }
+       }
+   }
+   ```

-    // Apply the configuration and add the Signals plugin
-    Signals.shared.useConfiguration(config)
-    Analytics.main.add(plugin: LivePlugins(fallbackFileURL: fallbackURL))
-    Analytics.main.add(plugin: Signals.shared)
+The typealiases replace SwiftUI's native controls with signal-generating versions. Your code remains unchanged, but interactions are now automatically captured.
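The shadowing mechanism itself is plain Swift. Here's an isolated sketch (the `SignalButtonModel` type and `capturedSignals` buffer are illustrative stand-ins, not the AnalyticsLive API) showing how a typealias redirects unchanged call sites to a wrapper that records activity:

```swift
// Stand-in types for illustration only; not the AnalyticsLive API.
var capturedSignals: [String] = []

struct SignalButtonModel {
    let title: String
    init(_ title: String) {
        self.title = title
        // A real Signal* wrapper would emit an interaction signal here.
        capturedSignals.append("created: \(title)")
    }
}

// The shadowing step: code written against `ButtonModel` now builds the wrapper.
typealias ButtonModel = SignalButtonModel

let model = ButtonModel("Click Me") // call site unchanged
```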
+ +#### UIKit + +UIKit automatic signal capture uses method swizzling and requires no code changes. + +1. Enable UIKit auto-signals in your configuration: + ```swift + Signals.shared.useConfiguration(SignalsConfiguration( + writeKey: "YOUR_WRITE_KEY", + useUIKitAutoSignal: true + // ... other options + )) + ``` + +2. The following UIKit interactions and navigation events are automatically captured via method swizzling: + + **Interactions:** + - `UIButton` taps + - `UISlider` value changes + - `UIStepper` value changes + - `UISwitch` toggle events + - `UITextField` text changes + - `UITableViewCell` selections + + **Navigation:** + - `UINavigationController` push/pop operations + - `UIViewController` modal presentations and dismissals + - `UITabBarController` tab switches + +### Capture navigation + +Navigation capture is handled automatically when you enable SwiftUI or UIKit auto-signals: + +- **SwiftUI**: Captured through `SignalNavigationLink` and `SignalNavigationStack` when you add the typealiases. +- **UIKit**: Captured automatically via `UINavigationController`, `UIViewController`, and `UITabBarController` swizzling. + +No additional setup is required beyond enabling the appropriate auto-signal flags. + +### Capture network + +Network capture automatically tracks URLSession requests and responses. + +1. Enable network auto-signals in your configuration: + ```swift + Signals.shared.useConfiguration(SignalsConfiguration( + writeKey: "YOUR_WRITE_KEY", + useNetworkAutoSignal: true, + allowedNetworkHosts: ["*"], // Allow all hosts (default) + blockedNetworkHosts: [] // Block specific hosts (optional) + // ... other options + )) ``` -Verify that you replaced `` with the actual write key you copied in [Step 1](#step-1-get-your-source-write-key). +2. Network requests made via URLSession are automatically captured, including: + - Request URL, method, headers, and body. + - Response status, headers, and body. + - Request or response correlation via request ID. 
+ +Third-party networking libraries that use URLSession underneath (like Alamofire) should work automatically. Segment API endpoints are automatically blocked to prevent recursive tracking. -#### SwiftUI projects +#### Configuring network hosts -If your app is written in SwiftUI, you need to add a `TypeAlias.swift` file to your project that captures interaction and navigation Signals, like in this example: +You can control which network requests are tracked: ```swift -import Foundation -import Signals - -typealias Button = SignalButton -typealias NavigationStack = SignalNavigationStack -typealias NavigationLink = SignalNavigationLink -typealias TextField = SignalTextField -typealias SecureField = SignalSecureField +SignalsConfiguration( + writeKey: "YOUR_WRITE_KEY", + useNetworkAutoSignal: true, + allowedNetworkHosts: ["api.myapp.com", "*.example.com"], // Only track these hosts + blockedNetworkHosts: ["analytics.google.com"] // Exclude these hosts +) ``` -## Step 3: Turn on Auto-Instrumentation in your source +- `allowedNetworkHosts`: Array of host patterns to track. Use `"*"` to allow all hosts (default). +- `blockedNetworkHosts`: Array of host patterns to exclude from tracking. + +The following hosts are automatically blocked to prevent recursive tracking: +- `api.segment.com` +- `cdn-settings.segment.com` +- `signals.segment.com` +- `api.segment.build` +- `cdn.segment.build` +- `signals.segment.build` + +## Step 3: Enable debug mode + +By default, Signals stores captured data on the device and doesn't forward it to Segment. This process prevents unnecessary bandwidth use and helps support privacy compliance requirements. + +To view captured signals in the Event Builder and create event generation rules, enable `sendDebugSignalsToSegment`. This setting temporarily lets the SDK send signal data to Segment while you're testing. + +In addition, the SDK obfuscates signals sent to Segment by default. 
To view the complete data, you need to turn off `obfuscateDebugSignals`.
+
+> warning ""
+> Only enable `sendDebugSignalsToSegment` in development environments. Avoid using `sendDebugSignalsToSegment` in production apps.
+
+You can enable `sendDebugSignalsToSegment` and turn off `obfuscateDebugSignals` in one of three ways.
+
+### Option 1: Use build configurations to toggle debug mode
+
+1. Define different configurations in your project settings (for example, Debug or Release).
+
+2. Use compiler flags to control the settings:
+   ```swift
+   #if DEBUG
+   let sendDebugSignalsToSegment = true
+   let obfuscateDebugSignals = false
+   #else
+   let sendDebugSignalsToSegment = false
+   let obfuscateDebugSignals = true
+   #endif
+
+   Signals.shared.useConfiguration(SignalsConfiguration(
+       writeKey: "YOUR_WRITE_KEY",
+       // ... other config options
+       sendDebugSignalsToSegment: sendDebugSignalsToSegment,
+       obfuscateDebugSignals: obfuscateDebugSignals
+   ))
+   ```
+
+### Option 2: Use a feature flag system
+
+If you're using Firebase Remote Config or a similar feature flag system, you can dynamically control `sendDebugSignalsToSegment` and `obfuscateDebugSignals` without requiring a new app build:
+   ```swift
+   let remoteConfig = RemoteConfig.remoteConfig()
+
+   Signals.shared.useConfiguration(SignalsConfiguration(
+       writeKey: "YOUR_WRITE_KEY",
+       // ... other config options
+       sendDebugSignalsToSegment: remoteConfig["sendDebugSignalsToSegment"].boolValue,
+       obfuscateDebugSignals: remoteConfig["obfuscateDebugSignals"].boolValue
+   ))
+   ```
+
+### Option 3: Use environment variables (for debugging or testing)
+
+You can check for environment variables or launch arguments during development:
+   ```swift
+   let isDebugEnabled = ProcessInfo.processInfo.environment["SIGNALS_DEBUG"] != nil
+
+   Signals.shared.useConfiguration(SignalsConfiguration(
+       writeKey: "YOUR_WRITE_KEY",
+       // ... 
other config options + sendDebugSignalsToSegment: isDebugEnabled, + obfuscateDebugSignals: !isDebugEnabled + )) + ``` + +## Step 4: Turn on Auto-Instrumentation in your source Next, return to the source settings to turn on Auto-Instrumentation: 1. Go to **Connections > Sources**. -2. Select the source you used in Step 1. +2. Select the source you used in [Step 1](#step-1-getting-started). 3. From the source's overview tab, go to **Settings > Advanced**. 4. Toggle Auto-Instrumention on. -## Step 4: Verify and deploy events +## Step 5: Verify and deploy events After integrating the SDK and running your app, verify that Segment is collecting signals: @@ -91,26 +292,30 @@ After integrating the SDK and running your app, verify that Segment is collectin 2. In the source overview, look for the **Event Builder** tab. If the tab doesn’t appear: - Make sure you've installed the SDK correctly. - Reach out to your Segment CSM to confirm that your workspace has the necessary feature flags enabled. -3. Launch your app [in debug mode](https://github.com/segmentio/analytics-next/tree/master/packages/signals/signals#sending-and-viewing-signals-on-segmentcom-debug-mode){:target="_blank"}. This enables signal collection so you can see activity in the Event Builder. +3. If `sendDebugSignalsToSegment` is enabled, Signals appear in real time in the Event Builder as you interact with the app. 4. Use the app as a user would: navigate between screens, tap buttons, trigger network requests. Signals appear in real time as you interact with the app. 5. In the Event Builder, find a signal and click **Configure event** to define a new event. After configuring the event, click **Publish event rules**. ## Configuration options -Using the Signals Configuration object, you can control the destination, frequency, and types of signals that Segment automatically tracks within your application. The following table details the configuration options for Signals-Swift. 
- -| `Option` | Required | Value | Description | -| ---------------------- | -------- | -------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | -| `writeKey` | Yes | String | Source write key | -| `maximumBufferSize` | No | Integer | The number of signals to be kept for JavaScript inspection. This buffer is first-in, first-out. Default is `1000`. | -| `relayCount` | No | Integer | Relays signals to Segment every Xth event. Default is `20`. | -| `relayInterval` | No | TimeInterval | Relays signals to segment every X seconds. Default is `60`. | -| `broadcasters` | No | `SignalBroadcaster` | An array of broadcasters. These objects forward signal data to their destinations, like `WebhookBroadcaster` or `DebugBroadcaster` writing to the developer console. Default is `SegmentBroadcaster`. | -| `useUIKitAutoSignal` | No | Bool | Tracks UIKit component interactions automatically. Default is `false`. | -| `useSwiftUIAutoSignal` | No | Bool | Tracks SwiftUI component interactions automatically. Default is `false`. | -| `useNetworkAutoSignal` | No | Bool | Tracks network events automatically. Default is `false`. | -| `allowedNetworkHosts` | No | Array | An array of allowed network hosts. | -| `blockedNetworkHosts` | No | Array | An array of blocked network hosts. +Using the Signals Configuration object, you can control the destination, frequency, and types of signals that Segment automatically tracks within your application. The following table details the configuration options for Signals-Swift: + + +| OPTION | REQUIRED | VALUE | DESCRIPTION | +|------------------|----------|---------------------------|-------------| +| **writeKey** | Yes | String | Your Segment write key. Should match your Analytics instance writeKey. 
| +| **maximumBufferSize** | No | Int | The number of signals to be kept for JavaScript inspection. This buffer is first-in, first-out. Default is **1000**. | +| **relayCount** | No | Int | Relays every X signals to Segment. Default is **20**. | +| **relayInterval** | No | TimeInterval | Relays signals to Segment every X seconds. Default is **60**. | +| **broadcasters** | No | SignalBroadcaster | An array of broadcasters. These objects forward signal data to their destinations, like **WebhookBroadcaster**, or you could write your own **DebugBroadcaster** that writes logs to the developer console. **SegmentBroadcaster** is always added by the SDK when `sendDebugSignalsToSegment` is true. | +| **sendDebugSignalsToSegment** | No | Bool | Turns on debug mode and allows the SDK to relay Signals to the Segment server. Default is **false**. It should only be set to true for development purposes. | +| **obfuscateDebugSignals** | No | Bool | Obfuscates signals being relayed to Segment. Default is **true**. | +| **apiHost** | No | String | API host for signal relay. Default is **"signals.segment.io/v1"**. | +| **useUIKitAutoSignal** | No | Bool | Enables automatic UIKit signal capture via method swizzling. Default is **false**. | +| **useSwiftUIAutoSignal** | No | Bool | Enables automatic SwiftUI signal capture (requires typealiases). Default is **false**. | +| **useNetworkAutoSignal** | No | Bool | Enables automatic network signal capture for URLSession. Default is **false**. | +| **allowedNetworkHosts** | No | Array | An array of host patterns to track. Use `["*"]` for all hosts. Default is **["*"]**. | +| **blockedNetworkHosts** | No | Array | An array of host patterns to exclude from tracking. Default is **[]**.
| ## Next steps diff --git a/src/connections/delivery-overview.md b/src/connections/delivery-overview.md index cc64ab5474..42eaf6470e 100644 --- a/src/connections/delivery-overview.md +++ b/src/connections/delivery-overview.md @@ -48,14 +48,25 @@ The pipeline view for storage destination includes the following steps: - **Failed on ingest**: Events that Segment received, but were dropped due to internal data validation rules. - **Filtered at source**: Events that were discarded due to schema settings or [Protocols](/docs/protocols/) Tracking Plans. - **Filtered at destination**: Events that were discarded due to [Destination Filters](/docs/guides/filtering-data/#destination-filters), [filtering in the Integrations object](/docs/guides/filtering-data/#filtering-with-the-integrations-object), [Destination Insert functions](/docs/connections/functions/insert-functions/), or [per source schema integration filters](/docs/guides/filtering-data/#per-source-schema-integrations-filters). [Actions destinations](/docs/connections/destinations/actions/) also have a filtering capability: for example, if your Action is set to only send Identify events, all other event types will be filtered out. Actions destinations with incomplete triggers or disabled mappings are filtered out at this step. [Consent Management](/docs/privacy/consent-management/) users also see events discarded due to consent preferences. -- **Events to warehouse rows**: A read-only box that shows the point in the delivery process where Segment converts events into warehouse rows. +- **Events pending retry**: A read-only box that shows the number of events that are awaiting retry. - **Failed to sync**: Syncs that either failed to sync or were partially successful. Selecting this step takes you to a table of all syncs with one or more failed collections. Select a sync from the table to view the discard reason, any collections that failed, the status, and the number of rows that synced for each collection. 
For information about common errors, see [Warehouse Errors](/docs/connections/storage/warehouses/warehouse-errors). -- **Successfully synced**: A record of all successful or partially successful syncs made with your destination. To view the reason a partially successfully sync was not fully successful, see the Failed to sync step. +- **Successfully synced**: A record of all successful or partially successful syncs made with your destination. To view the reason a partially successful sync was not fully successful, see the Failed to sync step. The following image shows a storage destination with 23 partially successful syncs: ![A screenshot of the Delivery Overview tab for a Storage destination, with the Failed to sync step selected and a table of partially successful syncs.](images/delivery-overview-storage-destinations.png) +##### Linked Audiences to Snowflake destination + +You can view information about events sent from Linked Audiences downstream to the Snowflake destination with the following pipeline view: + +- **Events from audience**: Events that Segment created for your activation. The number of events for each compute depends on the changes detected in your audience membership. +- **Filtered at source**: Events discarded by Protocols, either by the [schema settings](/docs/protocols/enforce/schema-configuration/) or [Tracking Plans](/docs/protocols/tracking-plan/create/). +- **Filtered at destination**: Events that were discarded due to [Destination Filters](/docs/guides/filtering-data/#destination-filters), [filtering in the Integrations object](/docs/guides/filtering-data/#filtering-with-the-integrations-object), [Destination Insert functions](/docs/connections/functions/insert-functions/), or [per source schema integration filters](/docs/guides/filtering-data/#per-source-schema-integrations-filters). [Actions destinations](/docs/connections/destinations/actions/) also have a filtering capability: for example, if your Action is set to only send Identify events, all other event types will be filtered out.
Actions destinations with incomplete triggers or disabled mappings are filtered out at this step. [Consent Management](/docs/privacy/consent-management/) users also see events discarded due to consent preferences. +- **Events pending retry**: A read-only box that shows the number of events that are awaiting retry. +- **Failed delivery**: Events that have been discarded due to errors or unmet destination requirements. Select a discard reason from the table to view all events that failed, sorted by collection. For information about common errors, see [Warehouse Errors](/docs/connections/storage/warehouses/warehouse-errors). +- **Successful delivery**: Events that were successfully delivered to Snowflake. + #### Destinations connected to Engage Destinations > info "Delivery Overview for Engage Destinations is in Public Beta" @@ -151,4 +162,4 @@ The Delivery Overview pipeline steps Failed on Ingest, Filtered at Source, Filte This table provides a list of all possible discard reasons available at each pipeline step. {% include content/delivery-overview-discards.html %} - + \ No newline at end of file diff --git a/src/connections/destinations/catalog/actions-braze-cloud/index.md b/src/connections/destinations/catalog/actions-braze-cloud/index.md index ca025d7370..5c731999a2 100644 --- a/src/connections/destinations/catalog/actions-braze-cloud/index.md +++ b/src/connections/destinations/catalog/actions-braze-cloud/index.md @@ -16,7 +16,7 @@ versions: [Braze](https://www.braze.com/){:target="_blank"}, formerly Appboy, is an engagement platform that empowers growth by helping marketing teams to build customer loyalty through mobile, omni-channel customer experiences. > success "" -> **Good to know**: This page is about the [Actions-framework](/docs/connections/destinations/actions/) Braze Segment destination. There's also a page about the [non-Actions Braze destination](/docs/connections/destinations/catalog/braze/). Both of these destinations receives data _from_ Segment. 
There's also the [Braze source](/docs/connections/sources/catalog/cloud-apps/braze//), which sends data _to_ Segment! +> **Good to know**: This page is about the [Actions-framework](/docs/connections/destinations/actions/) Braze Segment destination. There's also a page about the [non-Actions Braze destination](/docs/connections/destinations/catalog/braze/). Both of these destinations receive data _from_ Segment. There's also the [Braze source](/docs/connections/sources/catalog/cloud-apps/braze/), which sends data _to_ Segment! ## Benefits of Braze Cloud-Mode (Actions) vs Braze (Classic) @@ -77,4 +77,4 @@ Braze requires one of either `external_id`, `user_alias`, or `braze_id` to be pr #### Missing events When an event is sent under an alias, it might appear to be missing if the alias cannot be found in Braze. This might be due to an incorrect search for the alias in Braze. -To search for an alias in Braze, use the format `Alias Label:Alias Name`. For example, if the "Alias Label" field is set as email and "Alias Name" field is set as email address, for example: "test@email.com", you should search for the alias using "email:test@email.com". \ No newline at end of file +To search for an alias in Braze, use the format `Alias Label:Alias Name`. For example, if the "Alias Label" field is set as email and the "Alias Name" field is set as the email address "test@email.com", you should search for the alias using "email:test@email.com". diff --git a/src/connections/destinations/catalog/actions-ms-bing-capi/index.md b/src/connections/destinations/catalog/actions-ms-bing-capi/index.md index 758f4c835e..1fc8dc4670 100644 --- a/src/connections/destinations/catalog/actions-ms-bing-capi/index.md +++ b/src/connections/destinations/catalog/actions-ms-bing-capi/index.md @@ -16,8 +16,8 @@ redirect_from: "/connections/destinations/catalog/microsoft-bing-capi/" 3. Select an existing source to connect to the destination. 4.
Give the destination a name and click **Create destination**. 5. In **Basic Settings**, enter the Bing **UetTag** and **ApiToken**. - > * To find the UET tag, see Microsoft's steps on [how to create a UET tag](https://help.ads.microsoft.com/#apex/3/en/56682/2-500){:target="_blank"}. - > * To generate the API token, contact [Microsoft support](https://about.ads.microsoft.com/en/support){:target="_blank"} or fill out a [request form](https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbRwMZAe0PcMxHmZ0AjDaNRmxUM0o5UURRVktCRkxHNEFLTVNYQjI3NDNBUS4u){:target="_blank"}. + * To find the UET tag, see Microsoft's steps on [how to create a UET tag](https://help.ads.microsoft.com/#apex/3/en/56682/2-500){:target="_blank"}. + * To generate the API token, contact [Microsoft support](https://about.ads.microsoft.com/en/support){:target="_blank"} or fill out a [request form](https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbRwMZAe0PcMxHmZ0AjDaNRmxUM0o5UURRVktCRkxHNEFLTVNYQjI3NDNBUS4u){:target="_blank"}. 6. Toggle **Enable Destination** on to start sending data to Microsoft Bing CAPI from Segment. 
{% include components/actions-fields.html %} diff --git a/src/connections/destinations/catalog/actions-s3/index.md b/src/connections/destinations/catalog/actions-s3/index.md index db3ab8a919..e1de9ce71d 100644 --- a/src/connections/destinations/catalog/actions-s3/index.md +++ b/src/connections/destinations/catalog/actions-s3/index.md @@ -1,8 +1,8 @@ --- title: AWS S3 (Actions) Destination -hide-boilerplate: true -hide-dossier: false id: 66eaa166f650644f04389e2c +hide-boilerplate: true +hide-dossier: true private: true beta: true # versions: diff --git a/src/connections/destinations/catalog/actions-stackadapt-audiences/index.md b/src/connections/destinations/catalog/actions-stackadapt-audiences/index.md index 32e805c715..9c84bb6a48 100644 --- a/src/connections/destinations/catalog/actions-stackadapt-audiences/index.md +++ b/src/connections/destinations/catalog/actions-stackadapt-audiences/index.md @@ -9,7 +9,7 @@ redirect_from: "/connections/destinations/catalog/actions-stackadapt/" {% include content/plan-grid.md name="actions" %} -[StackAdapt](https://www.stackadapt.com/){:target="_blank"} is a leading programmatic advertising platform designed to maximize audience engagement. It enables marketers to run high-performing, cross-channel campaigns through real-time bidding, advanced audience targeting, and powerful data-driven insights. +[StackAdapt](https://www.stackadapt.com/){:target="\_blank"} is a leading programmatic advertising platform designed to maximize audience engagement. It enables marketers to run high-performing, cross-channel campaigns through real-time bidding, advanced audience targeting, and powerful data-driven insights. With the [Engage](/docs/engage/) integration, you can seamlessly sync your Engage Audiences and user data with StackAdapt to refine targeting precision and drive stronger campaign performance. 
@@ -60,6 +60,9 @@ Each Engage audience should only contain profiles that have a valid email addres - Select the source field for `Standard User Properties`. Ensure the source field matches the profile traits selected in step 4. You can learn more about the field format by hovering over the info icon of the field. - Follow the Destinations Actions documentation to [customize mappings](/docs/connections/destinations/actions/#customize-mappings). +> note "Trait synchronization" +> Custom, Computed, and Consent Traits are all mapped and included in the initial data synchronization. For ongoing updates, however, only Computed Traits are updated within StackAdapt's Data Hub. + To verify that your audience syncs with StackAdapt, open StackAdapt and navigate to **Audience & Attribution > Customer Data > Profiles**. On the Profiles tab, you should be able to see a list of profiles being synced to the StackAdapt platform. > info "Syncs can take up to 4 hours" diff --git a/src/connections/destinations/catalog/criteo-app-web-events/index.md b/src/connections/destinations/catalog/criteo-app-web-events/index.md index 99da7bddaa..17c13a7c82 100644 --- a/src/connections/destinations/catalog/criteo-app-web-events/index.md +++ b/src/connections/destinations/catalog/criteo-app-web-events/index.md @@ -395,4 +395,4 @@ Criteo Events can receive dates in a specific format, in order for us to pass al ### Is the mobile integration bundled? -Even though we don't support integrating with Criteo Events using Segment from a server source, it's still not necessary for you to [bundle](/docs/connections/spec/mobile-packaging-sdks//) the Criteo Events SDK into the Segment SDK! This is because while our mobile integration with them is powered from our servers, the integration requires metadata that can only be supplied by the user's mobile device (which is collected and passed along automatically by the Segment mobile SDK).
+Even though we don't support integrating with Criteo Events using Segment from a server source, it's still not necessary for you to [bundle](/docs/connections/spec/mobile-packaging-sdks/) the Criteo Events SDK into the Segment SDK! This is because while our mobile integration with them is powered from our servers, the integration requires metadata that can only be supplied by the user's mobile device (which is collected and passed along automatically by the Segment mobile SDK). diff --git a/src/connections/destinations/catalog/satismeter/index.md b/src/connections/destinations/catalog/satismeter/index.md index a56505fe7e..82da8e3ce6 100644 --- a/src/connections/destinations/catalog/satismeter/index.md +++ b/src/connections/destinations/catalog/satismeter/index.md @@ -1,6 +1,7 @@ --- title: SatisMeter Destination id: 54c02a5adb31d978f14a7f6f +hide-dossier: true --- [Our SatisMeter destination code](https://github.com/segment-integrations/analytics.js-integration-satismeter){:target="_blank"} is all open-source on GitHub if you want to check it out. diff --git a/src/connections/destinations/catalog/wishpond/index.md b/src/connections/destinations/catalog/wishpond/index.md index 8ec5082834..427ac22492 100644 --- a/src/connections/destinations/catalog/wishpond/index.md +++ b/src/connections/destinations/catalog/wishpond/index.md @@ -75,6 +75,6 @@ To more details how Wishpond's identify works visit [Wishpond API Docs: #track] Make sure you have copied the right keys from Wishpond's ["API Keys" dialog](https://www.wishpond.com/central/welcome?api_keys=true){:target="_blank"}, this destination will need `Merchant ID` and `Tracking Key`. 
-[Analytics.js]: https://segment.com//docs/connections/sources/catalog/libraries/website/javascript/ +[Analytics.js]: https://segment.com/docs/connections/sources/catalog/libraries/website/javascript/ [ci-link]: https://circleci.com/gh/segment-integrations/analytics.js-integration-wishpond [ci-badge]: https://circleci.com/gh/segment-integrations/analytics.js-integration-wishpond.svg?style=svg diff --git a/src/connections/sources/catalog/cloud-apps/rise-ai/index.md b/src/connections/sources/catalog/cloud-apps/rise-ai/index.md index e2375e3b60..ba9a901bd7 100644 --- a/src/connections/sources/catalog/cloud-apps/rise-ai/index.md +++ b/src/connections/sources/catalog/cloud-apps/rise-ai/index.md @@ -32,8 +32,8 @@ The table below lists events that Rise AI sends to Segment. These events appear | Event name | Description | | ------------------- | ------------------------------------------------------------------ | -| walkthrough-progress | User progress through AI-guided walkthroughs and onboarding flows | -| chats | AI chat session creation and interactions | +| Walkthrough Progress | User progress through AI-guided walkthroughs and onboarding flows | +| Chats | AI chat session creation and interactions | ## Event properties diff --git a/src/connections/sources/catalog/libraries/mobile/react-native/destination-plugins/index.md b/src/connections/sources/catalog/libraries/mobile/react-native/destination-plugins/index.md index ec31175aeb..ca542e31cc 100644 --- a/src/connections/sources/catalog/libraries/mobile/react-native/destination-plugins/index.md +++ b/src/connections/sources/catalog/libraries/mobile/react-native/destination-plugins/index.md @@ -51,13 +51,13 @@ plugins: mark: url: https://cdn.filepicker.io/api/file/k1fi9InSu6eint2IHilP - name: Firebase - url: connections/sources/catalog/libraries/mobile/react-native//destination-plugins/firebase-react-native/ + url: connections/sources/catalog/libraries/mobile/react-native/destination-plugins/firebase-react-native/ logo: 
url: https://cdn.filepicker.io/api/file/W6teayYkRmKgb8SMqxIn mark: url: https://cdn.filepicker.io/api/file/ztKtaLBUT7GUZKius5sa - name: FullStory - url: connections/sources/catalog/libraries/mobile/react-native//destination-plugins/fullstory-react-native/ + url: connections/sources/catalog/libraries/mobile/react-native/destination-plugins/fullstory-react-native/ logo: url: https://cdn.filepicker.io/api/file/0ET4vgkqTGNMRtZcFWCA mark: diff --git a/src/connections/sources/catalog/libraries/server/net/quickstart.md b/src/connections/sources/catalog/libraries/server/net/quickstart.md index 937f737bb9..23980d31ad 100644 --- a/src/connections/sources/catalog/libraries/server/net/quickstart.md +++ b/src/connections/sources/catalog/libraries/server/net/quickstart.md @@ -106,6 +106,6 @@ Congratulations! You can now track any event from the browser and the backend. H ## What's Next? -We just walked through the quickest way to get started with Segment using Analytics.js and the .NET library. You might also want to check out our full [Analytics.js reference](//docs/connections/sources/catalog/libraries/website/javascript/) to see what else is possible, or read about the [Tracking API methods](/docs/connections/spec/) to get a sense for the bigger picture. +We just walked through the quickest way to get started with Segment using Analytics.js and the .NET library. You might also want to check out our full [Analytics.js reference](/docs/connections/sources/catalog/libraries/website/javascript/) to see what else is possible, or read about the [Tracking API methods](/docs/connections/spec/) to get a sense for the bigger picture. If you're running an **Ecommerce** site or app you should also check out our [Ecommerce API reference](/docs/connections/spec/ecommerce/v2/) to make sure your products and checkout experience is instrumented properly! 
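The "What's Next?" pointer to the Tracking API methods can be made concrete with a small sketch of what a Track call carries. This is an illustrative sketch only, not the library's implementation; `buildTrackPayload` is a hypothetical helper, and the field names (`type`, `userId`, `event`, `properties`, `timestamp`) follow the Segment Spec's common fields:

```javascript
// Illustrative sketch of the minimal shape of a Track payload, using the
// common fields described in the Segment Spec. The helper is hypothetical.
function buildTrackPayload(userId, event, properties) {
  return {
    type: "track",                       // every Spec call carries a type
    userId: userId,                      // who performed the action
    event: event,                        // what they did, e.g. "Signed Up"
    properties: properties || {},        // free-form event properties
    timestamp: new Date().toISOString(), // when it happened, ISO-8601
  };
}

const payload = buildTrackPayload("user-123", "Signed Up", { plan: "Pro" });
console.log(payload.type);  // "track"
console.log(payload.event); // "Signed Up"
```

Whichever library you use — Analytics.js in the browser or the .NET library on the backend — the payload it sends to the Tracking API has this general shape.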
diff --git a/src/connections/sources/catalog/libraries/server/node/migration.md b/src/connections/sources/catalog/libraries/server/node/migration.md index b250ad9a93..3774eb04b4 100644 --- a/src/connections/sources/catalog/libraries/server/node/migration.md +++ b/src/connections/sources/catalog/libraries/server/node/migration.md @@ -60,7 +60,7 @@ If you're using the [classic version of Analytics Node.js](/docs/connections/sou #### Removals The updated Analytics Node.js removed these configuration options: -- `errorHandler` (see the docs on [error handling](/docs/connections/sources/catalog/libraries/server/node//#error-handling) for more information) +- `errorHandler` (see the docs on [error handling](/docs/connections/sources/catalog/libraries/server/node/#error-handling) for more information) The updated Analytics Node.js library removed undocumented behavior around `track` properties diff --git a/src/connections/spec/group.md b/src/connections/spec/group.md index 98f1ebd55b..369d683ea5 100644 --- a/src/connections/spec/group.md +++ b/src/connections/spec/group.md @@ -8,15 +8,14 @@ The Group call enables you to identify what account or organization your users a {% include components/reference-button.html href="https://university.segment.com/introduction-to-segment/324252?reg=1&referrer=docs" icon="media/academy.svg" title="Segment University: The Segment Methods" description="Check out our high-level overview of these APIs in Segment University. (Must be logged in to access.)" %} -In addition to the `groupId`, which is how you'd identify the specific group or company, the group method receives traits that are specific to the group, like industry or number of employees for example, that belong to that specific account. Like the traits of an identify call, you can update these when you call the same trait with a different value. 
+In addition to the `groupId`, which is how you'd identify the specific group or company, the group method receives traits that are specific to the group, such as the account's industry or number of employees. Like the traits of an Identify call, you can update these when you call the same trait with a different value. When using the Group call, it's helpful if you have accounts with multiple users. - -> info "Segment doesn't have an ungroup call" -> If you're using a device-mode destination that has a method for ungrouping users, you can invoke it directly on the client side [using Segment's ready() method](/docs/connections/sources/catalog/libraries/website/javascript/#ready). +> info "Segment doesn't have a native solution for removing users from a group" +> If you're using a device-mode destination that has a method for removing users from a group, you can invoke it directly on the client side [using Segment's ready() method](/docs/connections/sources/catalog/libraries/website/javascript/#ready). > -> For cloud-mode destinations, you can [create a Destination Function](/docs/connections/functions/destination-functions/) to ungroup users. +> For cloud-mode destinations, you can [create a Destination Function](/docs/connections/functions/destination-functions/) to remove users from a group. Here's the payload of a typical Group call, with most [common fields](/docs/connections/spec/common/) removed: @@ -106,7 +105,7 @@ Use the following interactive code pen to see what your Group calls would look l ## Group ID -A Group ID is the unique identifier which you recognize a group by in your own database. For example, if you're using MongoDB it might look something like `507f191e810c19729de860ea`. +A Group ID is the unique identifier by which you recognize a group in your own database. For example, if you're using MongoDB, your Group ID might look something like `507f191e810c19729de860ea`.
## Traits @@ -132,10 +131,9 @@ The following are the reserved traits Segment has standardized: | `website` | String | Website of a group. | | `plan` | String | Plan that a group is in. | -**Note:** You might be used to some destinations recognizing special properties differently. For example, Mixpanel has a special `track_charges` method for accepting revenue. Luckily, you don't have to worry about those inconsistencies. Just pass along `revenue`. **Segment handles all of the destination-specific conversions for you automatically.** Same goes for the rest of the reserved properties. +**Note:** You might be used to some destinations recognizing special properties differently. For example, Mixpanel has a special `track_charges` method for accepting revenue. **Segment handles all of the destination-specific conversions for you automatically.** -If you pass these values, `on null` will throw a `NullPointerException`. +If you pass a reserved trait as `null`, a `NullPointerException` will be thrown. You may still set values inside the trait; this works the same way it does with NoSQL data: if you previously set a value for a user and on the next request you send that property as `null`, the stored value is replaced by `null`, but if you omit the property, the original value is persisted. -**Traits are case-insensitive**, so in JavaScript you can match the rest of your camel-case code by sending `createdAt`, and in Ruby you can match your snake-case code by sending `created_at`. That way the API never seems alien to your code base. - +**Traits are case-insensitive**. In JavaScript you can match the rest of your camel-case code by sending `createdAt`, and in Ruby you can match your snake-case code by sending `created_at`. That way, the API never seems alien to your code base.
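The case-insensitive trait matching described above can be pictured as a small normalization step. This is a hypothetical sketch for illustration only — `normalizeTraitKey` is not a Segment function, and Segment's actual matching logic may differ — it just shows how `createdAt` and `created_at` can resolve to the same canonical trait key:

```javascript
// Hypothetical sketch (not Segment's implementation) of one way trait keys
// could be normalized so camelCase and snake_case spellings collide.
function normalizeTraitKey(key) {
  // Lowercase the key and strip separators, so "createdAt" and "created_at"
  // both become "createdat".
  return key.toLowerCase().replace(/[_-]/g, "");
}

console.log(normalizeTraitKey("createdAt"));  // "createdat"
console.log(normalizeTraitKey("created_at")); // "createdat"
```

Under a scheme like this, whichever casing convention your code base uses, the same reserved trait is recognized on Segment's side.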
\ No newline at end of file diff --git a/src/engage/journeys/images/path_joins.png b/src/engage/journeys/images/path_joins.png new file mode 100644 index 0000000000..5e98d61470 Binary files /dev/null and b/src/engage/journeys/images/path_joins.png differ diff --git a/src/engage/journeys/v2/event-triggered-journeys-steps.md b/src/engage/journeys/v2/event-triggered-journeys-steps.md index 0eb237231a..0f95bb68d4 100644 --- a/src/engage/journeys/v2/event-triggered-journeys-steps.md +++ b/src/engage/journeys/v2/event-triggered-journeys-steps.md @@ -170,11 +170,26 @@ You can also give branches uniques name to differentiate them from each other on > info "Evaluation is sequential" > Segment evaluates branches in the order they appear in the configuration side sheet. If a profile qualifies for multiple branches, Segment sends it down the first one it matches. Profiles can't qualify for more than one branch, and Segment doesn't wait for audience membership to update after the profile enters the step. You can change the evaluation order by dragging branches up or down in the configuration side sheet. +### Branch on journey context + +Data split branches can evaluate conditions based on event properties stored in the journey context. This lets you route journey instances based on real-time event data instead of static profile information. + +When you configure a branch with journey context conditions: + +1. Select the event object from journey context. + - The triggering event is always available, and any events from Hold until steps on the current path also show up. +2. Choose the specific property from that event. +3. Define the condition and value. + +Segment shows only event context available on the journey path leading to the Data split step. If an event was captured in a Hold until step on a different branch, it won't appear as an option for conditions on the current branch. 
+ +You can combine journey context conditions with trait-based and audience-based conditions in the same branch. Segment evaluates all conditions using `AND` logic, so the journey instance must satisfy every condition to follow that branch. + ### Example: Target different customer types or event properties -You can use a Data split to branch profiles based on event properties, traits, or audience membership that already exist on the profile when it reaches this step. For example: +You can use a Data split to branch journey instances based on event properties from journey context, profile traits, or audience membership. For example: -- Journey instances where the triggering event had a `transaction_total` > $100 are sent specific messaging about their high-ticket purchase. +- Journey instances where the triggering event's `transaction_total` property is greater than $100 receive high-value purchase messaging. - Profiles with a known `email_subscription_status` trait get treated as existing customers. - Profiles that belong to a `VIP` audience are routed down a separate path for high-value users. - Profiles with a specific set of traits (like favorite color and a known name) can receive personalized messaging. @@ -427,3 +442,68 @@ There may be cases where events sent to Segment are missing specific properties - Similarly, if a mapped trait is missing on the profile, the key is included in the payload with a value of `undefined`. Carefully configuring mappings and handling missing attributes can help you maintain data integrity and avoid errors in downstream systems. + +## Reconnect branches with path joins + +Path joins connect a branch of a journey to a step in another branch. This eliminates duplicate steps and saves journey step credits when multiple branches need to converge on the same downstream actions. + +Use path joins when different user segments need different initial treatments but should follow the same steps afterward. 
For example, high-value customers might receive multiple touchpoints through one branch while standard customers skip directly to a general follow-up step that both groups eventually reach. + +Path joins work well when: + +- Different user segments require unique messaging initially but share common downstream steps +- One branch needs fewer steps than another, and you want both to converge at a specific point +- Multiple branches lead to the same destination send or action step +- You want to reduce journey complexity and avoid duplicating identical steps across branches + +### Create a path join + +To create a path join: + +1. Go to the last step in the branch you want to connect. +2. Click the **+** icon at the end of that branch. +3. Select **Connect path to existing step**. +4. Choose a step from another branch to connect to. Segment only shows the available steps you can connect to. +5. The connection appears as a line on the canvas, showing the path join between branches. + +![Journey canvas showing the Connect path to existing step option in a dropdown menu. The menu appears when clicking a plus icon at the end of a journey branch, showing flow control options like Delay, Hold until, Data split, and Randomized split, along with the Connect path to existing step option at the top.](../images/path_joins.png) + +You can only connect to child steps (steps that come after the split), not parent steps or steps earlier in the journey. Each branch endpoint supports one path join connection. + +### Add steps within a path join + +You can add journey steps between the origin point and the target step of a path join. This lets you include branch-specific actions before profiles merge with the main path. + +To add a step within a path join: + +1. Click the **+** icon along the path join connection line. +2. Select the step type you want to add. +3. Configure the step as needed. 
+ +### Disconnect a path join + +You can remove a path join connection from either end: + +From the origin point: + +1. Click the **+** icon at the start of the path join connection. +2. Select **Disconnect**. +3. Confirm the disconnection in the modal. + +From the target step: + +1. Click the **+** icon at the target step where the path join connects. +2. Select **Disconnect**. +3. Choose which branch path the child steps should move into. +4. Confirm the disconnection. + +> info "" +> You can only edit path join connections while the journey is in Draft state. Published journeys must be edited in a new version to modify path joins. + +### Journey context and path joins + +When branches reconnect through a path join, the journey context for profiles includes only the events from their specific path. Context from steps on other branches is not available. + +For example, if Branch A includes a Hold until step that captures an event, and Branch B connects to a step after that Hold until, profiles from Branch B won't have access to the event context from Branch A's Hold until step. Only profiles that actually passed through Branch A will have that context available. + +When you configure Data split conditions after a path join, Segment dynamically shows only the context available for each possible path leading to that split. \ No newline at end of file diff --git a/src/engage/journeys/v2/limits.md b/src/engage/journeys/v2/limits.md index 0b45249bca..b3b45c1726 100644 --- a/src/engage/journeys/v2/limits.md +++ b/src/engage/journeys/v2/limits.md @@ -7,15 +7,15 @@ This page outlines product limitations for Event-Triggered (V2) Journeys. ## General limits -| Name | Limit | Description | -| ------------------- | --------------------------------------- | -------------------------------------------------------------------------------------------------------------------------- | -| Steps | 50 | Maximum number of steps per journey. 
| -| Journey name | 73 characters | Maximum length for Journey names. Names must be unique. | -| Step name | 73 characters | Maximum length for step names. | -| Branch name | 73 characters | Maximum length for branch names within a split step. Branch names must be unique across the journey. | -| Additional branches | 5 | Maximum number of branches supported in a split or Hold Until step. | -| Delay duration | Minimum: 5 minutes
Maximum: 182 days | Allowed time range for Delay and Hold Until steps. | -| Unique identifier | 500 characters | For “Re-enter every time event occurs” rules, you must define a unique identifier. The value is limited to 500 characters. | +| Name | Limit | Description | +| ------------------- | --------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Steps | 50 | Maximum number of steps per journey. | +| Journey name | 73 characters | Maximum length for Journey names. Names must be unique. | +| Step name | 73 characters | Maximum length for step names. | +| Branch name | 73 characters | Maximum length for branch names within a split step. Branch names must be unique across the journey. | +| Additional branches | 5 | Maximum number of branches supported in a split or Hold Until step. | +| Delay duration | Minimum: 5 minutes
Maximum: 182 days | Allowed time range for Delay and Hold Until steps. | +| Unique identifier | Property name: 500 characters
Property value: 150 characters | For "Re-enter every time event occurs" rules, you must define a unique identifier. The property name is limited to 500 characters, and the property value is limited to 150 characters. |

## Journey Step Billing
@@ -35,7 +35,7 @@ Plans with compute credits instead of journey steps consume 1 compute credit for

| Name | Limit | Description |
| ------------------------- | ----------------------- | ------------------------------------------------------------------------------------------------------------------------------- |
| Requests per second (RPS) | 25 events/sec/profile | Maximum events per second per Segment ID. Timer events are excluded. Excess events get dropped. |
-| Instances per profile | 25 concurrent instances | Maximum concurrent Journey instances per profile. |
+| Instances per profile | 25 concurrent instances | Maximum concurrent journey instances per profile across all journeys. |
| Send profiles back branch | 100 instances | Maximum number of times a single journey instance can pass through a Hold Until step's 'Send profiles back to the beginning of this step' branch. |

## Journey context
diff --git a/src/guides/how-to-guides/forecast-with-sql.md b/src/guides/how-to-guides/forecast-with-sql.md
index 3d376f2708..227f198900 100644
--- a/src/guides/how-to-guides/forecast-with-sql.md
+++ b/src/guides/how-to-guides/forecast-with-sql.md
@@ -190,7 +190,7 @@ At Toastmates, most of the highest forward-looking expected LTV customers share

With that in mind, you can define a behavioral cohort in our email tool, Customer.io, as well as create a trigger workflow so we can send an email offer to these customers.
-[Learn how to use email tools to target this cohort of high value customers.](https://segment.com/docs/guides/how-to-guides/forecast-with-sql//) +[Learn how to use email tools to target this cohort of high value customers.](https://segment.com/docs/guides/how-to-guides/forecast-with-sql/) ## Reward your best customers diff --git a/src/guides/what-is-replay.md b/src/guides/what-is-replay.md index 7db26b3e31..75442342ee 100644 --- a/src/guides/what-is-replay.md +++ b/src/guides/what-is-replay.md @@ -68,7 +68,7 @@ There are two types of replays with Engage. #### Nuances to Consider for Engage Replays **1. Replay a Profile Source's data into Engage Space** -- When a new Profile Source is connected to an Engage Space, the default option to replay the source's data seen over the past 30 days can be selected. To request a source's additional historical data be replayed to the Engage Space, contact Segment Support at friends@segment.com or [create a ticket](https://segment.com/docs/engage/[url](https://app.segment.com/goto-my-workspace/home?period=last-24-hours&v2=enabled&help=create-ticket)). Please see [this documentation](https://segment.com/docs/engage/quickstart/#step-3-connect-production-sources:~:text=Step%203%3A%20Connect,production%20sources.) on further details of this process and what to include in your support request. +- When a new Profile Source is connected to an Engage Space, the default option to replay the source's data seen over the past 30 days can be selected. To request a source's additional historical data be replayed to the Engage Space, contact Segment Support at friends@segment.com or [create a ticket](https://app.segment.com/goto-my-workspace/home?period=last-24-hours&v2=enabled&help=create-ticket). Please see [this documentation](https://segment.com/docs/engage/quickstart/#step-3-connect-production-sources:~:text=Step%203%3A%20Connect,production%20sources.) on further details of this process and what to include in your support request. **2. 
Replay from an Engage Space to its connected destination**
- Since each instance of a destination is connected to its own Engage "Output" source, that source contains events for every computation connected to that destination (_you can find the list of output sources under Unify > Unify Settings > Debugger_). Because of this, it's not possible to replay only a specific computation's data to the destination. Instead, reach out to [Segment support](https://segment.com/help/contact/) to request a resync of that computation to its destination. However, if you would like to replay all failed events seen by that destination, which encompasses all connected computations, you can do that with a replay.
diff --git a/src/unify/identity-resolution/delete-profile-identifier-api.md b/src/unify/identity-resolution/delete-profile-identifier-api.md
new file mode 100644
index 0000000000..d4008c4a39
--- /dev/null
+++ b/src/unify/identity-resolution/delete-profile-identifier-api.md
@@ -0,0 +1,185 @@
+---
+title: Delete Profile Identifier API
+plan: unify
+hidden: true
+---
+
+The Delete Profile Identifier API removes identifiers from a profile while preserving the profile's history, including traits, events, and merge history.
+
+Use this API to clean up outdated or incorrectly added identifiers without deleting entire profiles and replaying events.
+
+This page explains how to use the API, including how deletions work across Segment systems and what to consider before you begin.
+
+> info "Delete Profile Identifier API Private Beta"
+> The Delete Profile Identifier API is in Private Beta, and Segment is actively working on this feature. Some functionality may change before it becomes generally available.
+ +## Use cases + +The Delete Profile Identifier API helps you clean up identifiers that shouldn't be associated with a profile, including: + +- Mistakenly imported identifiers, like incorrect email addresses, that prevent accurate targeting in downstream tools +- Obsolete identifiers left over from database migrations or system changes +- Identifiers with a short lifespan that need to transfer between profiles. For example, when a user changes phone numbers or when a prepaid service expires, you can remove the phone number from one profile and add it to another. +- Old identifiers that cause profiles to violate [ID Resolution limits](/docs/unify/product-limits/#identity). +- Extra identifiers from misconfigured identity resolution settings. For example, if you reduced the `user_id` limit from 3 to 1, remove extra `user_id` values to resolve discrepancies between Segment and downstream tools like [Braze](/docs/connections/destinations/catalog/actions-braze-cloud/) or [Amplitude](/docs/connections/destinations/catalog/actions-amplitude/). + +## Before you begin + +> warning "Deletion scope" +> This API removes identifiers from Unify systems only. For complete user data deletion across all Segment systems (required for GDPR, CCPA, and other privacy regulations), see [Segment's user deletion and suppression guidance](/docs/privacy/user-deletion-and-suppression/). + +The Delete Profile Identifier API is available to Unify and Engage customers during private beta. + +You need one of these roles to delete identifiers: + +- Workspace Owner +- Identity Admin +- Unify and Engage Admin + +See [the Roles documentation](/docs/segment-app/iam/roles/) for more details. + +If you use [Profiles Sync](/docs/unify/profiles-sync/overview/), you must also: + +1. Add the `__operation` column to the `external_id_mapping_updates` table schema in your data warehouse: + - Default value: `CREATED` + - Deleted value: `REMOVED` +2. 
Verify that your analytics workloads (BI tools, data pipelines, ML models) can handle deleted identifiers. Make sure these systems stay operational and account for the `REMOVED` flag. + +## How deletion works + +When you delete an identifier, Segment removes it from [Identity Resolution](/docs/unify/identity-resolution/) and syncs the change to connected systems. + +The API confirms that Segment deleted the identifier from the real-time Identity Graph. The deletion then flows through these systems: + +| System | What happens | +| ---------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Real-time Profile storage | The Profile API and Profile explorer delete the identifier in near real time | +| Batch Profile Data lakehouse | Segment soft-deletes the identifier in the append-only table within minutes and filters it from the materialized view within 24 hours | +| Customer data warehouse | Profiles Sync adds a row to `external_id_mapping_updates` with `__operation` set to `REMOVED`. The `user_identifiers` view filters out removed identifiers | + + +## Send a deletion request + +You can only delete identifiers from known profiles. The API requires a valid `user_id` to locate the profile. + +The API returns an error if you try to delete: + +- All `user_id` values from a profile. Profiles must have at least one `user_id`. +- A `group_id` identifier. The API only supports individual profiles. + +### Authentication + +The API uses HTTP Basic Authentication. Base64-encode your access token with a trailing colon (the colon represents an empty password): + +```bash +echo -n 'your_token:' | base64 +``` + +Use the encoded value in the `Authorization` header of your requests. Generate your access token in **Unify > Unify settings > API access**. + +### Request format + +The API accepts one identifier per request. 
+
+Send requests to this endpoint:
+
+```bash
+POST https://{HOST_NAME}/v1/spaces/{SPACE_ID}/collections/users/profiles/user_id:{USER_ID_VALUE}/external_ids/delete
+```
+
+Replace the following parameters in the URL:
+
+| Parameter | Description |
+| --------------- | ---------------------------------------------------------------------------------------------------- |
+| `HOST_NAME` | `profiles.segment.com` for North America workspaces or `profiles.euw1.segment.com` for EU workspaces |
+| `SPACE_ID` | Your space ID. Find this in **Unify > Unify settings > API access**. |
+| `USER_ID_VALUE` | The `user_id` value that identifies the profile. |
+
+Include these fields in the request body:
+
+| Field | Description |
+| --------------------- | ------------------------------------------------------------------------------ |
+| `delete_external_ids` | Array containing the identifier to delete. Limit: 1 identifier per request |
+| `id` | Value of the identifier to delete (for example, `hello@example.com`) |
+| `type` | Type of identifier to delete (for example, `email`, `anonymous_id`, `user_id`) |
+
+### Example request
+
+The following request deletes the `email` identifier `example@gmail.com` from the profile with `user_id` `user_001`, using the Basic `Authorization` header described in [Authentication](#authentication):
+
+```bash
+curl --location --request POST 'https://profiles.segment.com/v1/spaces/spa_abc123/collections/users/profiles/user_id:user_001/external_ids/delete' \
+--header 'Authorization: Basic ' \
+--header 'Content-Type: application/json' \
+--data '{
+  "delete_external_ids": [
+    {
+      "id": "example@gmail.com",
+      "type": "email"
+    }
+  ]
+}'
+```
+
+## Responses and error codes
+
+The API returns the following HTTP status codes:
+
+| HTTP Code | Code | Message |
+| --------- | ---------------------- | ----------------------------------------------------------------------- |
+| `200` | `success` | External identifier has been deleted.
| +| `400` | `unsupported_eid_type` | Unsupported external id type. | +| `400` | `bad_request` | Missing required parameters in URL. | +| `400` | `bad_request` | Invalid URL: valid `user_id` is required. Unsupported ``. | +| `400` | `bad_request` | Only one external_id can be deleted at a time. | +| `400` | `bad_request` | Invalid collection: ``. | +| `400` | `bad_request` | External id specification must differ from lookup id. | +| `401` | `unauthorized` | The specified token is invalid. | +| `403` | `forbidden` | Deleted identifier not activated for space_id ``. | +| `404` | `not_found` | The resource was not found. | +| `404` | `eid_not_found` | External identifier not found. | +| `404` | `source_id_not_found` | No source attached to space_id ``. | +| `429` | `rate_limit_error` | Attempted to delete more than 100 IDs per second for a single profile. | + +## Considerations and deletion behavior + +Keep the following information in mind as you use the Delete Profile Identifier API. + +### Deletion scope + +The Delete Profile Identifier API removes identifiers from Unify systems, including Identity Resolution, Profile Storage, and Profile Sync to your data warehouse. However, deletion doesn't extend to all Segment systems. Identifiers remain in the event archive and are soft-deleted in the Batch Profile Data Lakehouse. + +Segment doesn't delete identifiers from downstream destinations like Braze, Amplitude, Facebook, Engage Audiences, Journeys, Linked Audiences, or Consent settings. You must update these systems separately. + +### Rate limits + +Segment allows up to 100 deletion requests per second per space and 100 deletions per second for identifiers on a single profile. + +### Response time + +Most deletion requests complete in under 3 seconds. Deletions on profiles with more than 15 merges or 50 identifier mappings may take longer. 
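Because of the limits above, a bulk cleanup job should pace its requests and retry when the API returns `429`. The sketch below shows only the retry logic, not Segment's API itself: `delete_with_backoff` is a hypothetical helper, and `send` stands in for whatever HTTP call issues one deletion `POST` and returns the status code.

```python
import time

def delete_with_backoff(send, max_retries=5, base_delay=0.5):
    """Call send() until it returns a non-429 status, backing off exponentially.

    `send` is a placeholder for your HTTP client call that POSTs a single
    deletion request and returns the HTTP status code.
    """
    status = None
    for attempt in range(max_retries):
        status = send()
        if status != 429:
            # Success (200) or a non-retryable error (4xx): stop retrying.
            return status
        # Rate limited: wait 0.5s, 1s, 2s, ... before the next attempt.
        time.sleep(base_delay * (2 ** attempt))
    return status
```

A job that additionally sleeps a few milliseconds between requests stays under the 100 requests/second space limit without ever reaching the retry path.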
+ +Deletion syncs to connected systems at different speeds: + +- **Real-time Profile storage**: seconds to 5 minutes +- **Profiles Sync**: depends on your sync schedule + +### Space rebuilds and replays + +If you rebuild a space from Segment archives, deletions don't replay automatically. You must rerun deletions after the replay completes. + +### Identifier reintroduction + +Segment may reintroduce deleted identifiers in these limited cases: + +- **Event replays**: Replaying events from the Event Archive that reference deleted identifiers adds them back to the profile. +- **Engage or Journey sync timing**: Deleting an identifier within 5 minutes of sending an event that references it may result in the identifier being reintroduced through Engage-generated events. + +### Profile API source + +When you first use the Delete Profile Identifier API, Segment creates a `profile-api-source` for internal tracking. This source may appear in your workspace.
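Putting the authentication, URL, and request-body rules above together, the following Python sketch builds (but does not send) a deletion request. The helper names (`build_auth_header`, `build_delete_url`, `build_delete_body`) and the token, space, and ID values are hypothetical placeholders, not part of Segment's API:

```python
import base64
import json

def build_auth_header(token: str) -> str:
    # Basic auth: base64-encode the access token followed by a trailing
    # colon (the colon represents an empty password).
    encoded = base64.b64encode(f"{token}:".encode()).decode()
    return f"Basic {encoded}"

def build_delete_url(host: str, space_id: str, user_id: str) -> str:
    # Endpoint format from the Request format section above.
    return (f"https://{host}/v1/spaces/{space_id}"
            f"/collections/users/profiles/user_id:{user_id}/external_ids/delete")

def build_delete_body(id_value: str, id_type: str) -> str:
    # The API accepts exactly one identifier per request.
    return json.dumps({"delete_external_ids": [{"id": id_value, "type": id_type}]})

# Placeholder values matching the example request above:
url = build_delete_url("profiles.segment.com", "spa_abc123", "user_001")
headers = {
    "Authorization": build_auth_header("your_token"),
    "Content-Type": "application/json",
}
body = build_delete_body("example@gmail.com", "email")
# POST `body` to `url` with `headers` using your HTTP client of choice.
```

From there, issuing the `POST` with any HTTP client (or with curl, as in the example request above) completes the deletion.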