lurepaper committed · verified · Commit 36d2eb0 · Parent(s): 2097bc6

Update README.md

Files changed (1): README.md (+56 −0)
---
license: apache-2.0
language:
- en
base_model:
- Qwen/Qwen3-32B
---
# LURE 5.3

This is the LURE model for Lua 5.3.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "lurepaper/LURE_5.3"

model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = "You are a Lua programming language expert. Please generate concise Lua code that produces the Lua 5.3 opcode OP_ADD. Use a print function call at the end to show the execution result of the opcode."

messages = [
    {"role": "user", "content": prompt},
]

text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=32768,
)

output_ids = generated_ids[0][len(model_inputs.input_ids[0]):].tolist()

# Split off the thinking content that precedes the </think> token.
try:
    # rindex: find the last occurrence of token id 151668 (</think>)
    index = len(output_ids) - output_ids[::-1].index(151668)
except ValueError:
    index = 0

thinking_content = tokenizer.decode(output_ids[:index], skip_special_tokens=True).strip("\n")
content = tokenizer.decode(output_ids[index:], skip_special_tokens=True).strip("\n")

print("Thinking content:")
print(thinking_content)

print("Generated LuaGadget:")
print(content)
```