AutoSkill Bi-LSTM Text Generation with External Knowledge Integration
Implement a text generation model using a Bi-LSTM architecture in Keras that integrates external knowledge sources (dictionaries, ontologies, or concept associations) to guide the generation process and produce meaningful sentences.
Install
source · Clone the upstream repo
git clone https://github.com/ECNU-ICALK/AutoSkill
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/ECNU-ICALK/AutoSkill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/SkillBank/ConvSkill/english_gpt3.5_8/bi-lstm-text-generation-with-external-knowledge-integration" ~/.claude/skills/ecnu-icalk-autoskill-bi-lstm-text-generation-with-external-knowledge-integration && rm -rf "$T"
manifest:
SkillBank/ConvSkill/english_gpt3.5_8/bi-lstm-text-generation-with-external-knowledge-integration/SKILL.md
Source content
Bi-LSTM Text Generation with External Knowledge Integration
Implement a text generation model using a Bi-LSTM architecture in Keras that integrates external knowledge sources (dictionaries, ontologies, or concept associations) to guide the generation process and produce meaningful sentences.
Prompt
Role & Objective
You are a Machine Learning Engineer specializing in NLP and Keras. Your task is to write Python code for a text generation model using a Bidirectional LSTM (Bi-LSTM) architecture.
Operational Rules & Constraints
- Architecture: Use a Keras Sequential model with Embedding, Bidirectional(LSTM), and Dense layers.
- Data Preparation: Include steps for tokenization, sequence padding, and creating input/target pairs.
- External Knowledge Integration: The model or generation loop must integrate external knowledge sources (e.g., dictionaries, ontologies, concept associations) to guide the text generation process. This is to ensure meaningful output rather than repetitive sequences.
- Generation Logic: Implement a loop to generate text word by word based on a seed text.
- Compatibility: Ensure code handles variable definitions (vocab_size, embedding_dim) and uses np.argmax() with model.predict() instead of the deprecated .predict_classes().
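The architecture and data-preparation rules above can be sketched as follows. This is a minimal illustration, not the skill's reference implementation: the layer sizes (128 LSTM units), the sequence length of 3, and the toy corpus are all assumptions, and the TensorFlow import is deferred inside build_model so the data-preparation code runs on its own.

```python
import numpy as np

def build_model(vocab_size, embedding_dim, seq_len):
    # Sketch of the required stack: Sequential + Embedding +
    # Bidirectional(LSTM) + Dense. Import is deferred so the
    # data-prep code below runs even without TensorFlow installed.
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense
    model = Sequential([
        Embedding(vocab_size, embedding_dim, input_length=seq_len),
        Bidirectional(LSTM(128)),
        Dense(vocab_size, activation="softmax"),
    ])
    model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    return model

def prepare_sequences(corpus, seq_len=3):
    """Tokenize, build a vocabulary, and create padded input/target pairs."""
    words = " ".join(corpus).lower().split()
    vocab = {w: i + 1 for i, w in enumerate(sorted(set(words)))}  # 0 = padding
    X, y = [], []
    for line in corpus:
        toks = [vocab[w] for w in line.lower().split()]
        for i in range(1, len(toks)):
            seq = toks[max(0, i - seq_len):i]
            seq = [0] * (seq_len - len(seq)) + seq  # left-pad to seq_len
            X.append(seq)
            y.append(toks[i])  # target is the next word
    return np.array(X), np.array(y), vocab

X, y, vocab = prepare_sequences(["the cat sat on the mat"])
print(X.shape, y.shape, len(vocab))  # → (5, 3) (5,) 5
```

Training would then be `build_model(len(vocab) + 1, embedding_dim, seq_len)` followed by `model.fit(X, y)`; the `+ 1` accounts for the padding index 0.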
Anti-Patterns
- Do not simply post-process the output to remove repeated words; the generation itself must be guided by knowledge.
- Do not use deprecated Keras methods like .predict_classes().
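To make the "guide, don't post-process" rule concrete, here is a hedged sketch of one possible knowledge-guided decoding step. The ASSOCIATIONS map and the probability vector are illustrative stand-ins: in a real pipeline the map would come from a dictionary, ontology, or concept-association resource, and the probabilities from `model.predict()` on the padded seed sequence.

```python
import numpy as np

# Hypothetical concept-association map: token id -> related token ids.
# In practice this would be built from WordNet/ConceptNet or a domain ontology.
ASSOCIATIONS = {3: {5, 7}, 4: {2}}

def knowledge_guided_next(probs, context_ids, associations, boost=2.0):
    """Pick the next token by reweighting the model's softmax output with
    external concept associations, rather than post-filtering repeats."""
    probs = probs.copy()
    related = set()
    for t in context_ids:
        related |= associations.get(t, set())
    for t in related:                 # boost words conceptually tied to context
        probs[t] *= boost
    probs[list(context_ids)] *= 0.1   # discourage (not forbid) repetition
    probs /= probs.sum()
    return int(np.argmax(probs))      # same argmax step used on model.predict()

# Toy softmax output over an 8-word vocabulary (mock Bi-LSTM prediction).
p = np.array([0.05, 0.30, 0.10, 0.05, 0.05, 0.20, 0.10, 0.15])
print(knowledge_guided_next(p, [3, 1], ASSOCIATIONS))  # → 5
```

Because the reweighting happens before argmax, the knowledge source steers which word is generated, satisfying the anti-pattern rule above; stripping repeats from already-generated text would not.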
Triggers
- bi-lstm text generation with external knowledge
- integrate dictionaries or ontologies into text generation
- improve bi-lstm meaningfulness using knowledge sources
- text generation code using concept associations