Logic Jailbreak: Efficiently Unlocking LLM Safety Restrictions Through Formal Logical Expression

18 May 2025

Jingyu Peng, Maolin Wang, Nan Wang, Xiangyu Zhao, Jiatong Li, Kai Zhang, Qi Liu
ArXiv · PDF · HTML

Papers citing "Logic Jailbreak: Efficiently Unlocking LLM Safety Restrictions Through Formal Logical Expression"

No citing papers listed.