
fix(seo): move top-level robots into metadata block (2 files) #25

Open
bblietz wants to merge 1 commit into rokudev:v2.0 from bblietz:seo/pr0a-top-level-robots

Conversation


bblietz commented May 7, 2026

What

Two files had a top-level robots: index key and no metadata: block, diverging from the canonical frontmatter shape used in 547 of 582 docs pages:

  • docs/The Roku Channel/video-on-demand/title-avail-specifications.md
  • docs/The Roku Channel/live-linear/ovp-linear-ingest-spec.md

This PR moves robots into a stub metadata: block in the canonical key-order position, so the SEO frontmatter backfill pipeline can populate metadata.title/metadata.description without producing duplicate robots: keys.

Diff

2 files changed, 8 insertions(+), 2 deletions(-). Each file: removed the top-level robots: index line; inserted a 4-line stub metadata: block in canonical position (after hidden:, before any closing). No other content modified.
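
The resulting frontmatter change in each file looks roughly like this. Only the `robots: index` key and the 4-line stub `metadata:` block after `hidden:` are confirmed by this PR; the stub field names (`title`, `description`) are an assumption based on what the backfill pipeline is described as populating later:

```yaml
# Before: top-level robots, no metadata block
hidden: false          # illustrative value
robots: index

# After: robots nested inside a stub metadata block, in canonical position
hidden: false          # illustrative value
metadata:
  title: ""            # assumed stub field, to be backfilled
  description: ""      # assumed stub field, to be backfilled
  robots: index
```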

Verification

Before: top-level robots: present, no metadata: block. The SEO snapshot tool flagged both as top_level_robots without metadata block.
After: top-level robots: removed, metadata: block in canonical position with robots: index inside. Both files match the canonical 547-of-582 frontmatter shape.
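
A minimal sketch of the kind of shape check the snapshot tool performs (a hypothetical re-implementation, not the actual tool; the label `top_level_robots` is taken from the output quoted above):

```python
import re

def classify_frontmatter(text: str) -> str:
    """Classify a docs page's YAML frontmatter shape.

    Hypothetical sketch of the snapshot tool's check: flag pages that
    carry a top-level `robots:` key with no `metadata:` block, versus
    the canonical shape with a `metadata:` block present.
    """
    # Grab everything between the opening and closing `---` fences.
    match = re.match(r"---\n(.*?)\n---", text, re.DOTALL)
    if not match:
        return "no_frontmatter"
    # Top-level keys start at column 0; nested keys (e.g. inside
    # `metadata:`) are indented and deliberately excluded here.
    top_keys = re.findall(r"^(\w[\w-]*):", match.group(1), re.MULTILINE)
    if "robots" in top_keys and "metadata" not in top_keys:
        return "top_level_robots"
    if "metadata" in top_keys:
        return "canonical"
    return "other"
```

Running this over the two files before and after the change would flip their classification from `top_level_robots` to `canonical`.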

Context

This is PR #0a of an SEO frontmatter backfill effort that will populate metadata.title, metadata.description, and excerpt across all ~582 docs pages over a series of follow-up PRs. PR #0a fixes a corpus pre-condition (alongside PR #0 which removed a duplicate frontmatter block in ad-requirements.md).

🤖 Generated with Claude Code

These 2 files had top-level `robots: index` and no `metadata:` block,
diverging from the canonical frontmatter shape used in 547 of 582 docs.
Move `robots` into a stub `metadata:` block so the SEO frontmatter
backfill pipeline can populate `metadata.title`/`metadata.description`
without producing duplicate `robots:` keys (PR #0a of the SEO sweep).

bblietz (author) commented May 7, 2026

Closing as redundant: the top-level robots: cleanup is already reflected on v2.0 (no top-level robots: key in the affected files). Some files this PR touched have also been deleted upstream, making the diff stale.

bblietz closed this May 7, 2026
bblietz reopened this May 7, 2026

bblietz (author) commented May 7, 2026

Re-opened: I closed this prematurely. The 2 Roku Channel files (ovp-linear-ingest-spec.md and title-avail-specifications.md) still have top-level robots: keys on v2.0; the file deletions in the diff came from unrelated upstream changes measured against this PR's stale base, not from anything this PR touches. The actual commit (1a760f5) only modifies those 2 files, and the change is still needed for the SEO backfill pipeline.
