
chore(deps): update huggingface/skills digest to 221f5f7#548

Open
renovate[bot] wants to merge 1 commit into main from renovate/huggingface-skills-digest

Conversation


renovate[bot] commented Apr 22, 2026

This PR contains the following updates:

  • Package: huggingface/skills
  • Update: digest
  • Change: 061ab49 → 221f5f7

Configuration

📅 Schedule (UTC):

  • Branch creation
    • At any time (no schedule defined)
  • Automerge
    • At any time (no schedule defined)

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever the PR becomes conflicted, or when you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.
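
The behavior described above (no branch-creation schedule, automerge disabled, digest pinning) could be expressed in a minimal renovate.json along these lines. The repository's actual configuration is not shown in this PR, so the keys below are illustrative, not a copy of it:

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "automerge": false,
  "packageRules": [
    {
      "matchDepNames": ["huggingface/skills"],
      "pinDigests": true
    }
  ]
}
```

With `pinDigests` enabled, Renovate tracks the dependency by commit digest and opens PRs like this one whenever the pinned digest moves.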

@github-actions

🛡️ Skill Security Scan Results

✅ hf-cli

  • Status: Passed
  • Findings: 3
  • Allowed (not blocking): 3
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)
    • PIPELINE_TAINT_FLOW (Allowed: The skill's prerequisites cite the official hf CLI installer (curl -LsSf https://hf.co/cli/install.sh | bash) and the hf-mount installer (curl -fsSL https://raw.githubusercontent.com/huggingface/hf-mount/main/install.sh | sh) as documented install commands. The scanner itself flags both as 'instructional install text in SKILL.md'.)
    • PIPELINE_TAINT_FLOW (Allowed: The skill's prerequisites cite the official hf CLI installer (curl -LsSf https://hf.co/cli/install.sh | bash) and the hf-mount installer (curl -fsSL https://raw.githubusercontent.com/huggingface/hf-mount/main/install.sh | sh) as documented install commands. The scanner itself flags both as 'instructional install text in SKILL.md'.)

✅ hf-mcp

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-community-evals

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-datasets

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-gradio

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-llm-trainer

  • Status: Passed
  • Findings: 11
  • Allowed (not blocking): 11
    • TOOL_ABUSE_UNDECLARED_NETWORK (Allowed: The skill orchestrates training jobs on Hugging Face Jobs cloud GPUs via the HF MCP server's hf_jobs tool. The network requirement is through the HF MCP server dependency (packaged in toolhive-catalog under registries/official/servers/huggingface), not a direct network-access tool in frontmatter.)
    • SOCIAL_ENG_MISLEADING_DESC (Allowed: Scanner heuristic flags the broad scope of the description (training SFT/DPO/GRPO + GGUF conversion + monitoring + etc.) as 'performing actions not reflected in description'. The description accurately reflects the skill's documented scope; the flag is a scanner conservatism false positive.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • TOOL_ABUSE_SYSTEM_PACKAGE_INSTALL (Allowed: The bundled scripts/convert_to_gguf.py references sudo apt-get install / sudo yum install for optional system packages (build tools) when converting trained models to GGUF format. These run in ephemeral HF Jobs containers, not on the user's host. The script is HF-authored and documented in SKILL.md.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: Bundled helper scripts (scripts/dataset_inspector.py, scripts/hf_benchmarks.py) use urllib.request to query the public Hugging Face Hub API for dataset validation and benchmark lookups — documented workflow steps required by the skill.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: Bundled helper scripts (scripts/dataset_inspector.py, scripts/hf_benchmarks.py) use urllib.request to query the public Hugging Face Hub API for dataset validation and benchmark lookups — documented workflow steps required by the skill.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: Bundled helper scripts (scripts/dataset_inspector.py, scripts/hf_benchmarks.py) use urllib.request to query the public Hugging Face Hub API for dataset validation and benchmark lookups — documented workflow steps required by the skill.)
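
For context on the DATA_EXFIL_NETWORK_REQUESTS findings above: a helper like the cited `dataset_inspector.py` can hit the public Hub API with only the standard library. This is a hedged sketch of that pattern, not the skill's actual code; the endpoint path and header are illustrative:

```python
# Sketch: build a stdlib-only request for public Hugging Face Hub dataset
# metadata, the kind of call the scanner flags as a network request.
from urllib.parse import quote
from urllib.request import Request

def hub_dataset_url(dataset_id: str) -> str:
    # Public Hub metadata lives under /api/datasets/<id> (illustrative path).
    return f"https://huggingface.co/api/datasets/{quote(dataset_id, safe='/')}"

req = Request(hub_dataset_url("squad"), headers={"User-Agent": "skill-scan-example"})
print(req.full_url)
```

Because the destinations are fixed, public API hosts like this, the scanner findings are allowed rather than treated as exfiltration.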

✅ huggingface-paper-publisher

  • Status: Passed
  • Findings: 6
  • Allowed (not blocking): 6
    • TOOL_ABUSE_UNDECLARED_NETWORK (Allowed: The skill uses network access through its bundled paper_manager.py script (as its documented workflow), but does not declare an explicit network-access tool in frontmatter. All network calls target the public Hugging Face Hub API documented in the SKILL.md.)
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: scripts/paper_manager.py uses requests.get() to query the public Hugging Face Hub API (api.huggingface.co) for paper metadata — the skill's entire purpose. The destinations are the official HF API endpoints documented in the SKILL.md workflow.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: scripts/paper_manager.py uses requests.get() to query the public Hugging Face Hub API (api.huggingface.co) for paper metadata — the skill's entire purpose. The destinations are the official HF API endpoints documented in the SKILL.md workflow.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: scripts/paper_manager.py uses requests.get() to query the public Hugging Face Hub API (api.huggingface.co) for paper metadata — the skill's entire purpose. The destinations are the official HF API endpoints documented in the SKILL.md workflow.)
    • FILE_MAGIC_MISMATCH (Allowed: templates/modern.md is a paper template that legitimately uses Handlebars-style {{}} substitution syntax. Magika detects the Handlebars markers and flags the format mismatch; the file is plain text documentation and safe.)
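
The FILE_MAGIC_MISMATCH entry above refers to Handlebars-style `{{}}` markers in a plain-text template. A minimal sketch of that substitution pattern (field names and template text are illustrative, not taken from templates/modern.md):

```python
# Sketch: fill Handlebars-style {{placeholder}} markers in a text template.
# Unknown placeholders are left untouched, like the ones Magika detects.
import re

def render(template: str, fields: dict[str, str]) -> str:
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: fields.get(m.group(1), m.group(0)),
                  template)

print(render("# {{title}}\nBy {{author}}", {"title": "My Paper", "author": "Ada"}))
```

The markers are inert text until a renderer substitutes them, which is why the file is safe despite the detected format mismatch.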

✅ huggingface-papers

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-tool-builder

  • Status: Passed
  • Findings: 5
  • Allowed (not blocking): 5
    • LOW_ANALYZABILITY (Allowed: The scanner reports that 1 of 8 files is opaque — this is an executable .tsx reference script (references/baseline_hf_api.tsx) that the scanner cannot fully parse. The file is a small, HF-authored example script and is readable markdown-adjacent TypeScript.)
    • TOOL_ABUSE_UNDECLARED_NETWORK (Allowed: The skill uses network access through its bundled reference scripts that call the public Hugging Face Hub API. The frontmatter does not declare a dedicated network-access tool, but the network calls are documented examples bundled for user education, not runtime execution by the skill itself.)
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: The skill's baseline reference scripts (references/baseline_hf_api.py) use urllib.request.Request/urlopen against the public Hugging Face Hub API (api.huggingface.co). Teaching users to build HF API-consuming tools is the skill's entire purpose.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: The skill's baseline reference scripts (references/baseline_hf_api.py) use urllib.request.Request/urlopen against the public Hugging Face Hub API (api.huggingface.co). Teaching users to build HF API-consuming tools is the skill's entire purpose.)

✅ huggingface-trackio

  • Status: Passed
  • Findings: 1
  • Allowed (not blocking): 1
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)

✅ huggingface-vision-trainer

  • Status: Passed
  • Findings: 4
  • Allowed (not blocking): 4
    • TOOL_ABUSE_UNDECLARED_NETWORK (Allowed: The skill orchestrates vision training jobs on Hugging Face Jobs cloud GPUs via the HF MCP server's hf_jobs tool. The network requirement is through the HF MCP server dependency (packaged in toolhive-catalog under registries/official/servers/huggingface), not a direct network-access tool in frontmatter.)
    • SOCIAL_ENG_MISLEADING_DESC (Allowed: Scanner heuristic flags the breadth of the description (object detection + image classification + SAM/SAM2 segmentation) as 'performing actions not reflected in description'. The description accurately reflects the skill's documented scope; the flag is a scanner conservatism false positive.)
    • MANIFEST_MISSING_LICENSE (Allowed: huggingface/skills is licensed Apache-2.0 at the repository root; upstream does not embed an SPDX license identifier in per-skill SKILL.md frontmatter.)
    • DATA_EXFIL_NETWORK_REQUESTS (Allowed: The bundled scripts/dataset_inspector.py uses urllib.request.urlopen() to query the public Hugging Face Hub API for dataset format validation — a documented workflow step required before launching GPU training.)

✅ transformers-js

  • Status: Passed
  • Findings: 0

Summary: Scanned 12 skill(s), all passed security checks. ✅
