apollo-link-scalars

TypeDoc-generated docs are here

GitHub repo here

    Custom Apollo Link to parse custom scalars from GraphQL responses and serialize them back in variables. It also validates enums, can strip __typename from inputs, and now includes a cache rehydration helper for JSON-persisted Apollo caches. See Usage, Options, and Rehydrating a persisted cache (reviveScalarsInCache).

The deprecated Apollo Client v2 is used in the 0.x branch.

Of the 0.x family, versions 0.1.x and 0.2.x are deprecated and a migration to 0.3.x is recommended.

The 1.x family is considered deprecated and a migration to 2.x or greater is recommended.

apollo-link-scalars v5+ supports both Apollo Client v3 and Apollo Client v4.

    • @apollo/client v4 support alongside the existing v3 support. The peerDependencies range is now 3.x || 4.x.
    • New reviveScalarsInCache helper for re-applying custom parseValue to a JSON-restored Apollo cache. See Rehydrating a persisted cache (reviveScalarsInCache).
    • No source-level breaking changes for code already using withScalars on 4.x. Upgrading from 4.0.3 to 5.x is a drop-in bump.

The versions that included makeExecutableSchema from graphql-tools are deprecated. These are the versions:

    • 0.1.x and 0.2.x => please migrate to 0.3.x (apollo client v2 line, deprecated)
    • 1.x => please migrate to 2.x (apollo client v3 line)

    If you are not using makeExecutableSchema from this library, the upgrade will be transparent.

If you are using makeExecutableSchema, you just need to replace it with the one from the version of graphql-tools compatible with the version of Apollo Client that you are using. Please have a look at the Example of loading a schema.

Parsing scalars at link level means that the Apollo cache will receive them already parsed. Depending on what kind of parsing is performed, this may interact with the cache JSON serialization of, for example, apollo-cache-persist. While apollo-cache-persist has an option to turn that serialization off, other persistence tools may have similar issues.
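The interaction is easy to reproduce without Apollo at all: a value parsed into a rich type survives in memory, but a JSON round-trip (which is what a JSON-backed persistor performs) flattens it back to its serialized form. A minimal sketch using a plain Date:

```typescript
// A parsed scalar (here a plain Date) does not survive a JSON round-trip:
// JSON.stringify calls Date#toJSON, and JSON.parse has no way to undo it.
const inMemory = { createdAt: new Date("2024-01-15T10:00:00.000Z") };

const persisted = JSON.stringify(inMemory); // what a JSON-backed store saves
const restored = JSON.parse(persisted); // what the app gets back on rehydration

console.log(inMemory.createdAt instanceof Date); // true
console.log(restored.createdAt instanceof Date); // false: it is a string now
```

The same flattening happens to any custom scalar class the link has parsed into the cache.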

    In the original Apollo Client Github issue thread about scalar parsing, this situation was discussed.

Apollo Client still does not support this natively. The original 2016 ticket was closed in 2018 as a housekeeping redirect to apollographql/apollo-feature-requests#368, which has remained open ever since. A potential solution of parsing after the cache might have other issues, like returning different instances for the cached data, which may not be ideal in situations that rely on instance identity (e.g. React re-render control). I think some users will benefit more from the automatic parsing and serializing than they will be hurt by the potential cache interactions.

    UPDATE: @woltob surfaced the JSON-backed persistence case in issue #760. The reviveScalarsInCache helper documented below is available in apollo-link-scalars v5+.

    Install the library together with graphql, plus the Apollo Client version your app already uses.

    pnpm add apollo-link-scalars graphql @apollo/client
    

    Use apollo-link-scalars v5+ if you are on @apollo/client v3 or v4.

    • Parses custom scalar fields in GraphQL responses by walking the query result with your schema.
    • Serializes custom scalar input values before they are sent over the network.
    • Lets typesMap overrides win over schema scalar implementations when you need app-specific behavior.
    • Optionally validates enum values and removes __typename from inputs.
    • Rehydrates parsed scalar values back into a JSON-restored Apollo cache with reviveScalarsInCache in v5+.

This repository includes small React/Vite apps that demonstrate the main supported scenarios.

    At runtime you provide:

    • a GraphQLSchema
    • optionally, a typesMap with custom parseValue / serialize functions
    • optionally, behavior flags such as enum validation, __typename stripping, and nullFunctions

    Build the link with withScalars() and place it before your HTTP link.

    import { withScalars } from "apollo-link-scalars";
    import { ApolloClient, ApolloLink, HttpLink, InMemoryCache } from "@apollo/client/core";
    import { schema } from "./my-schema";

    const httpLink = new HttpLink({ uri: "http://example.org/graphql" });

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([withScalars({ schema }), httpLink]),
});

    You can override specific scalar parsing or serialization rules with typesMap. These functions take priority over any scalar implementation already present in the schema.

    import { withScalars } from "apollo-link-scalars";
    import { ApolloLink, HttpLink } from "@apollo/client/core";
    import { isString } from "es-toolkit";
    import { schema } from "./my-schema";

const typesMap = {
  CustomScalar: {
    serialize: (parsed: unknown): string | null => (parsed instanceof CustomScalar ? parsed.toString() : null),
    parseValue: (raw: unknown): CustomScalar | null => {
      if (!raw) return null; // if for some reason we want to treat empty string as null, for example
      if (isString(raw)) {
        return new CustomScalar(raw);
      }

      throw new Error("invalid value to parse");
    },
  },
};

    const link = ApolloLink.from([withScalars({ schema, typesMap }), new HttpLink({ uri: "http://example.org/graphql" })]);
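The serialize/parseValue pair above can be exercised in isolation. A hypothetical CustomScalar class (the snippet assumes one exists but does not define it) makes the round-trip concrete:

```typescript
// Hypothetical stand-in for the CustomScalar class the snippet above assumes.
class CustomScalar {
  constructor(private readonly value: string) {}
  toString(): string {
    return this.value;
  }
}

// Mirrors the typesMap entry: raw string -> CustomScalar and back.
const parseValue = (raw: unknown): CustomScalar | null => {
  if (!raw) return null; // treat empty string (and other falsy values) as null
  if (typeof raw === "string") return new CustomScalar(raw);
  throw new Error("invalid value to parse");
};

const serialize = (parsed: unknown): string | null =>
  parsed instanceof CustomScalar ? parsed.toString() : null;

const roundTripped = serialize(parseValue("abc-123"));
console.log(roundTripped); // "abc-123"
```

The link applies exactly this pair at the network boundary: parseValue on response fields typed with the scalar, serialize on matching input values in variables.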

    withScalars() accepts these extra options:

• removeTypenameFromInputs (Boolean, default false): when enabled, removes __typename from inputs wherever it is found. This can be useful when data received from a query is used as an input in another query.
• validateEnums (Boolean, default false): when enabled, validates enums on parsing, throwing an error if it sees a value that is not one of the enum values.
• nullFunctions (NullFunctions, optional): by passing a set of transforms for boxing and unboxing nullable types, you can automatically construct e.g. Maybe monads from null values. See Changing the behaviour of nullable types.

withScalars({
  schema,
  typesMap,
  validateEnums: true,
  removeTypenameFromInputs: true,
});
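Conceptually, validateEnums is a guard against a response carrying a value outside the enum's declared members. A sketch of that idea (not the library's internals; the enum and its values are hypothetical):

```typescript
// Conceptual sketch (not the library's internals) of what validateEnums
// guards against: a response value outside the enum's declared members.
const STATUS_VALUES = new Set(["ACTIVE", "INACTIVE"]); // hypothetical enum

function validateEnum(value: string): string {
  if (!STATUS_VALUES.has(value)) {
    throw new Error(`invalid enum value: ${value}`);
  }
  return value;
}

console.log(validateEnum("ACTIVE")); // passes through unchanged
// validateEnum("BOGUS") would throw
```

With validateEnums disabled (the default), unknown enum values pass through untouched.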

    This is the usual shape in an application:

    import { ApolloClient, ApolloLink, HttpLink, InMemoryCache } from "@apollo/client";
    import { withScalars } from "apollo-link-scalars";
    import { schema, typesMap } from "./graphql/scalars";

    const cache = new InMemoryCache();
    const httpLink = new HttpLink({ uri: "/graphql" });

export const client = new ApolloClient({
  cache,
  link: ApolloLink.from([withScalars({ schema, typesMap, validateEnums: true }), httpLink]),
});

Example of loading a schema with makeExecutableSchema:

import { gql } from "@apollo/client/core";
    import { GraphQLScalarType, Kind } from "graphql";
    import { makeExecutableSchema } from "@graphql-tools/schema";

// GraphQL Schema definition.
const typeDefs = gql`
  type Query {
    myList: [MyObject!]!
  }

  type MyObject {
    day: Date
    days: [Date]!
    nested: MyObject
  }

  "represents a Date with time"
  scalar Date
`;

const resolvers = {
  // example of scalar type, which will parse the string into a custom class CustomDate which receives a Date object
  Date: new GraphQLScalarType({
    name: "Date",
    serialize: (parsed: CustomDate | null) => parsed && parsed.toISOString(),
    parseValue: (raw: any) => raw && new CustomDate(new Date(raw)),
    parseLiteral(ast) {
      if (ast.kind === Kind.STRING || ast.kind === Kind.INT) {
        return new CustomDate(new Date(ast.value));
      }
      return null;
    },
  }),
};

// GraphQL Schema, required to use the link
const schema = makeExecutableSchema({
  typeDefs,
  resolvers,
});
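The resolver above references a CustomDate class without defining it. A minimal sketch of what such a wrapper might look like (hypothetical; your application's class can differ, it only needs the methods the scalar's serialize calls):

```typescript
// Hypothetical CustomDate wrapper assumed by the schema example above;
// it wraps a native Date and exposes the methods the scalar uses.
class CustomDate {
  constructor(private readonly internalDate: Date) {}

  toISOString(): string {
    return this.internalDate.toISOString();
  }

  getTime(): number {
    return this.internalDate.getTime();
  }
}

const d = new CustomDate(new Date("2018-01-01T00:00:00.000Z"));
console.log(d.toISOString()); // "2018-01-01T00:00:00.000Z"
```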

    Warning: Be sure to watch your bundle size and know what you are doing.

    Codegen config to generate introspection data:

    codegen.yml

---
generates:
  src/__generated__/graphql.schema.json:
    plugins:
      - "introspection"
    config:
      minify: true

    Synchronous code to create link instance in common scenario:

import introspectionResult from "./__generated__/graphql.schema.json";
import { buildClientSchema, IntrospectionQuery } from "graphql";
import { withScalars } from "apollo-link-scalars";

const schema = buildClientSchema(introspectionResult);
// note: sometimes it seems to be needed to cast it as IntrospectionQuery
// `const schema = buildClientSchema((introspectionResult as unknown) as IntrospectionQuery)`

const scalarsLink = withScalars({
  schema,
  typesMap: { … },
});

    By passing the nullFunctions parameter to withScalars, you can change the way nullable types are handled. The default implementation leaves them as-is, i.e. null => null and value => value. If instead you want to transform nulls into a Maybe monad, you can supply functions corresponding to the following type. The examples below are based on the Maybe monad from Seidr, but any implementation will do.

type NullFunctions = {
  serialize(input: any): any | null;
  parseValue(raw: any | null): any;
};

const nullFunctions: NullFunctions = {
  parseValue(raw: any) {
    if (isNone(raw)) {
      return Nothing();
    } else {
      return Just(raw);
    }
  },
  serialize(input: any) {
    return input.caseOf({
      Just(value) {
        return value;
      },
      Nothing() {
        return null;
      },
    });
  },
};

    The nullFunctions are executed after the normal parsing/serializing. The normal parsing/serializing functions are not called for null values.

Both in parsing and serializing, we have the following logic (in pseudocode):

// serializing
if (isNone(value)) {
  return this.nullFunctions.serialize(value);
}

const serialized = serializeNonNullValue(value);
return this.nullFunctions.serialize(serialized);

// parsing
if (isNone(value)) {
  return this.nullFunctions.parseValue(value);
}

const parsed = parseNonNullValue(value);
return this.nullFunctions.parseValue(parsed);
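Under those rules, a Maybe-boxing nullFunctions pair behaves as below. The Maybe here is a minimal hand-rolled stand-in for Seidr's Just/Nothing so the sketch stays self-contained; any Maybe implementation with an equivalent shape works:

```typescript
// Minimal hand-rolled Maybe, standing in for Seidr's Just/Nothing so the
// sketch stays self-contained.
type Maybe<T> = { kind: "just"; value: T } | { kind: "nothing" };
const Just = <T>(value: T): Maybe<T> => ({ kind: "just", value });
const Nothing = <T>(): Maybe<T> => ({ kind: "nothing" });

const nullFunctions = {
  // null/undefined from the response become Nothing; everything else is boxed.
  parseValue<T>(raw: T | null | undefined): Maybe<T> {
    return raw === null || raw === undefined ? Nothing<T>() : Just(raw);
  },
  // unbox on the way out: Just(value) -> value, Nothing -> null.
  serialize<T>(input: Maybe<T>): T | null {
    return input.kind === "just" ? input.value : null;
  },
};

console.log(nullFunctions.serialize(nullFunctions.parseValue(42))); // 42
console.log(nullFunctions.serialize(nullFunctions.parseValue(null))); // null
```

Because nullFunctions runs after the normal scalar parsing, the boxed value is the already-parsed scalar, not the raw wire value.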

    withScalars runs inside the Apollo link chain, so it only parses operations flowing through the network. If you persist the Apollo cache with a JSON-backed store — apollo3-cache-persist, AsyncStorage, Redux-Persist, a custom adapter — the cache entries come back from storage as the shape JSON can hold: a custom DateTime becomes an ISO string, a custom Money becomes whatever serialize emitted, etc. The link never runs on rehydration, so the consumer never sees the parsed types. This is issue #760.

    reviveScalarsInCache is a pure, schema-driven helper that fixes this. Call it on the extracted cache snapshot to re-apply the custom parseValue functions to every scalar field declared in the schema, then hand the result back to cache.restore.

import { ApolloClient, ApolloLink, InMemoryCache } from "@apollo/client/core";
import { reviveScalarsInCache, withScalars } from "apollo-link-scalars";
import { LocalStorageWrapper, persistCache } from "apollo3-cache-persist";

const cache = new InMemoryCache();

await persistCache({ cache, storage: new LocalStorageWrapper(window.localStorage) });

// `persistCache` has just repopulated the cache from storage. Revive the
// snapshot so downstream cache reads see parsed scalars again.
cache.restore(reviveScalarsInCache(cache.extract(), { schema, typesMap }));

const client = new ApolloClient({
  cache,
  link: ApolloLink.from([withScalars({ schema, typesMap }), httpLink]),
});

    Works with any JSON-backed store, including ones that hand you the raw payload directly:

    const raw = JSON.parse(await AsyncStorage.getItem("apollo-cache"));
    cache.restore(reviveScalarsInCache(raw, { schema, typesMap }));

    Use the same schema, typesMap, and nullFunctions you already use in withScalars so network responses and cache rehydration produce the same shapes.

    Options:

    • schema (required) — the same GraphQLSchema you pass to withScalars.
    • typesMap (required) — the same map you pass to withScalars. Entries here win over any parseValue defined on the schema scalar. Leaf types defined only on the schema are still applied (same merge behavior as withScalars).
    • nullFunctions (optional) — pass the same transform you pass to withScalars if you're boxing nullable values into a Maybe monad; nullable fields are wrapped through it on rehydration, matching what the link produces on the network path. Defaults to identity.

    Caveats:

    • Mutates the passed snapshot in place and returns the same reference. Pass a fresh object such as cache.extract() or a JSON.parse(...) result, not a live structure shared with the rest of the app.
    • Requires __typename on embedded non-normalized objects (Apollo's default — new InMemoryCache() adds it). Caches built with addTypename: false skip embedded object revival because there is no typename to look up in the schema. Top-level normalized entities still work because their __typename is part of the cache key Apollo writes regardless.
    • Interfaces, unions, and enum-scalar validation are out of scope in this first pass. Scalar fields nested under an interface- or union-typed field are not revived because the helper does not resolve the runtime __typename on the value itself the way the parser does.
    • Idempotence is caller-contingent. If you run the helper twice on the same snapshot, every scalar's parseValue runs twice. Safe only when parseValue detects its own output and short-circuits — e.g. (v) => typeof v === "string" ? new Date(v) : v leaves Date instances alone on a second pass. A naive (v) => Number(v) * 100 will silently corrupt a second call (150 -> 15000).
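The idempotence caveat is worth checking directly against your own typesMap. A self-detecting parseValue passes a double run unchanged, while a naive numeric one corrupts the value (both functions below are illustrative, not part of the library):

```typescript
// Self-detecting parseValue: safe to run twice on the same snapshot,
// because it leaves already-parsed Date instances alone.
const parseDate = (v: unknown): unknown => (typeof v === "string" ? new Date(v) : v);

const once = parseDate("2024-01-15T10:00:00.000Z");
const twice = parseDate(once);
console.log(once === twice); // true: the second pass is a no-op

// Naive parseValue: NOT idempotent; a second revival corrupts the value.
const parseCents = (v: unknown): number => Number(v) * 100;
console.log(parseCents("1.50")); // 150
console.log(parseCents(parseCents("1.50"))); // 15000
```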

    The link code is heavily based on apollo-link-response-resolver by will-heart.

While the approach in apollo-link-response-resolver is to apply resolvers based on the types taken from __typename, this library follows the query and the schema to parse based on scalar types. Note that apollo-link-response-resolver is archived now.

    I started working on this after following the Apollo feature request https://github.com/apollographql/apollo-feature-requests/issues/2.

    See documentation for development

    For the current release checklist, CI publishing setup, and npm trusted publishing workflow, see RELEASING.md.

    Commits should follow the Conventional Commits format. The repository enforces this with commitlint, and commit-and-tag-version uses those commit messages to determine the version bump and generate CHANGELOG.md.

    If you want help composing a compliant commit message, use commitizen:

    # one-off interactive commit message helper
    pnpm dlx git-cz

    This project uses commit-and-tag-version for release commits, tags, and changelog generation.

    # bump package.json version, update CHANGELOG.md, git tag the release
    pnpm version

    You may find a tool like wip helpful for managing work in progress before you're ready to create a meaningful commit.

    The canonical release process now lives in RELEASING.md. In short:

    • verify locally with pnpm test:full and pnpm e2e:run
    • run pnpm version to create the release commit, changelog update, and tag
    • push with git push --follow-tags origin <release-branch>
    • let GitHub Actions publish the package to npm via trusted publishing

    See RELEASING.md for --first-release, --prerelease, and --sign flags.

    pnpm doc:html && pnpm doc:publish
    

    This will generate the docs and publish them in Github pages.

    There is a single command for preparing a release candidate locally:

    # Prepare a standard release
    pnpm prepare-release

    # Push to git
    git push --follow-tags origin <release-branch>

    Thanks goes to these wonderful people (emoji key):

Eduardo Turiño: 🤔 🚇 ⚠️ 💻 📖
Genadi Samokovarov: 🐛 ⚠️ 💻
Jiří Brabec: 📖 🐛 ⚠️ 💻 🤔
Jakub Petriska: 🐛
Deyan Dobrinov: 🐛 🤔
Hugh Barrigan: ⚠️ 💻 🤔
Jeff Lau: 📖
Florian Cargoët: 🐛
Jaff Parker: 🐛 🚇
Kenneth: 📖
Sir.Nathan (Jonathan Stassen): 🚇 📦
Robert Schadek: 🚇 💻 📦

    This project follows the all-contributors specification. Contributions of any kind welcome!