Contents of /markup/html/whatpm/t/tokenizer-result.txt

Revision 1.318
Sat Sep 5 11:31:58 2009 UTC by wakaba
Branch: MAIN
CVS Tags: HEAD
Changes since 1.317: +96 -19 lines
File MIME type: text/plain
++ whatpm/t/ChangeLog	5 Sep 2009 11:31:07 -0000
	* tokenizer-test-1.test: Changed to keep non-normal character
	references (HTML5 revision 3374).

2009-09-05  Wakaba  <wakaba@suika.fam.cx>

++ whatpm/Whatpm/HTML/ChangeLog	5 Sep 2009 11:31:46 -0000
	* Tokenizer.pm.src: Changed to keep non-normal character
	references as is (HTML5 revision 3374).

2009-09-05  Wakaba  <wakaba@suika.fam.cx>
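
The change logged above ("keep non-normal character references as is", HTML5 revision 3374) is what produces the new failures recorded below: tests that still expect a U+FFFD replacement for surrogate, noncharacter, and Windows-1252-hole numeric references now see the raw code point instead. As a rough illustration only, here is a minimal standalone Perl sketch of the two behaviours; it is not taken from Tokenizer.pm.src, and the helper name resolve_numeric_ref is made up for this example.

  use strict;
  use warnings;

  # resolve_numeric_ref($code_point, $keep_as_is) returns the character a
  # tokenizer might emit for a numeric character reference (hypothetical
  # helper, not the Whatpm API).
  #   $keep_as_is = 0 : older behaviour the tests below still expect (U+FFFD)
  #   $keep_as_is = 1 : "keep non-normal character references as is"
  sub resolve_numeric_ref {
    my ($cp, $keep_as_is) = @_;
    my $non_normal =
         ($cp >= 0xD800 && $cp <= 0xDFFF)                   # surrogates (tests 62, 212, 970)
      || (($cp & 0xFFFE) == 0xFFFE)                         # noncharacters such as U+10FFFF (test 205)
      || (grep { $cp == $_ } 0x81, 0x8D, 0x8F, 0x90, 0x9D); # Windows-1252 holes (tests 790, 802, ...)
    return chr $cp if !$non_normal || $keep_as_is;
    return "\x{FFFD}";
  }

  # &#0129; (cf. test 790): U+FFFD was expected, U+0081 is now produced.
  printf "expected: U+%04X\n", ord resolve_numeric_ref(0x81, 0);   # U+FFFD
  printf "got:      U+%04X\n", ord resolve_numeric_ref(0x81, 1);   # U+0081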

1 wakaba 1.287 1..1129
2 wakaba 1.273 # Running under perl version 5.010000 for linux
3 wakaba 1.318 # Current time local: Sat Sep 5 20:30:33 2009
4     # Current time GMT: Sat Sep 5 11:30:33 2009
5 wakaba 1.1 # Using Test.pm version 1.25
6 wakaba 1.11 # t/tokenizer/test1.test
7 wakaba 1.20 ok 1
8 wakaba 1.298 not ok 2
9     # Test 2 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #2)
10     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HTML',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype uppercase: qq'<!DOCTYPE HTML>')
11     # Line 4 is changed:
12     # - " qq'HTML',\n"
13     # + " qq'html',\n"
14     # t/HTML-tokenizer.t line 205 is: ok $parser_dump, $expected_dump,
15     not ok 3
16     # Test 3 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #3)
17     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype mixed case: qq'<!DOCTYPE HtMl>')
18     # Line 4 is changed:
19     # - " qq'HtMl',\n"
20     # + " qq'html',\n"
21 wakaba 1.1 ok 4
22 wakaba 1.20 ok 5
23 wakaba 1.1 ok 6
24     ok 7
25     ok 8
26     ok 9
27     ok 10
28     ok 11
29     ok 12
30     ok 13
31     ok 14
32 wakaba 1.130 ok 15
33 wakaba 1.1 ok 16
34     ok 17
35     ok 18
36 wakaba 1.296 not ok 19
37     # Test 19 got: "$VAR1 = [\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #19)
38     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (Comment, two central dashes: qq'<!-- --comment -->')
39     # Line 2 is missing:
40     # - " qq'ParseError',\n"
41 wakaba 1.1 ok 20
42     ok 21
43 wakaba 1.25 ok 22
44     ok 23
45 wakaba 1.1 ok 24
46 wakaba 1.22 ok 25
47     ok 26
48     ok 27
49 wakaba 1.1 ok 28
50     ok 29
51     ok 30
52     ok 31
53     ok 32
54     ok 33
55 wakaba 1.18 ok 34
56 wakaba 1.1 ok 35
57     ok 36
58     ok 37
59 wakaba 1.8 ok 38
60 wakaba 1.28 ok 39
61     ok 40
62 wakaba 1.43 ok 41
63     ok 42
64 wakaba 1.286 ok 43
65 wakaba 1.11 # t/tokenizer/test2.test
66 wakaba 1.286 not ok 44
67     # Test 44 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #44)
68 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (DOCTYPE without name: qq'<!DOCTYPE>')
69 wakaba 1.20 # Line 6 is changed:
70 wakaba 1.8 # - " qq'',\n"
71 wakaba 1.20 # + " undef,\n"
72     ok 45
73     ok 46
74     ok 47
75     ok 48
76     ok 49
77     ok 50
78     ok 51
79 wakaba 1.97 ok 52
80     ok 53
81     ok 54
82     ok 55
83 wakaba 1.9 ok 56
84     ok 57
85 wakaba 1.1 ok 58
86     ok 59
87     ok 60
88 wakaba 1.19 ok 61
89 wakaba 1.318 not ok 62
90     # Test 62 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{D869}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DED6}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #62)
91     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Hexadecimal entity pair representing a surrogate pair: qq'&#xD869;&#xDED6;')
92     # Line 5 is changed:
93     # - " qq'\\x{FFFD}'\n"
94     # + " qq'\\x{D869}'\n"
95     # Line 10 is changed:
96     # - " qq'\\x{FFFD}'\n"
97     # + " qq'\\x{DED6}'\n"
98 wakaba 1.1 ok 63
99 wakaba 1.130 ok 64
100 wakaba 1.1 ok 65
101 wakaba 1.240 ok 66
102     ok 67
103     ok 68
104 wakaba 1.1 ok 69
105     ok 70
106 wakaba 1.34 ok 71
107     ok 72
108 wakaba 1.1 ok 73
109     ok 74
110 wakaba 1.21 ok 75
111     ok 76
112 wakaba 1.1 ok 77
113 wakaba 1.141 ok 78
114 wakaba 1.1 ok 79
115 wakaba 1.305 ok 80
116 wakaba 1.34 ok 81
117 wakaba 1.286 # t/tokenizer/test3.test
118 wakaba 1.15 ok 82
119 wakaba 1.1 ok 83
120     ok 84
121 wakaba 1.25 ok 85
122     ok 86
123 wakaba 1.34 ok 87
124 wakaba 1.1 ok 88
125     ok 89
126     ok 90
127     ok 91
128 wakaba 1.296 not ok 92
129     # Test 92 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #92)
130     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (<!----.: qq'<!----.')
131     # Line 3 is missing:
132     # - " qq'ParseError',\n"
133 wakaba 1.1 ok 93
134     ok 94
135 wakaba 1.8 ok 95
136     ok 96
137     ok 97
138     ok 98
139     ok 99
140     ok 100
141 wakaba 1.96 ok 101
142     ok 102
143     ok 103
144     ok 104
145 wakaba 1.141 ok 105
146 wakaba 1.286 ok 106
147     ok 107
148     ok 108
149     not ok 109
150     # Test 109 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #109)
151 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype >: qq'<!doctype >')
152 wakaba 1.43 # Line 5 is changed:
153     # - " qq'',\n"
154     # + " undef,\n"
155 wakaba 1.286 not ok 110
156     # Test 110 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #110)
157 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype : qq'<!doctype ')
158 wakaba 1.43 # Line 5 is changed:
159     # - " qq'',\n"
160     # + " undef,\n"
161 wakaba 1.8 ok 111
162     ok 112
163     ok 113
164 wakaba 1.10 ok 114
165 wakaba 1.287 ok 115
166 wakaba 1.10 ok 116
167     ok 117
168     ok 118
169 wakaba 1.287 ok 119
170 wakaba 1.10 ok 120
171     ok 121
172 wakaba 1.39 ok 122
173 wakaba 1.18 ok 123
174 wakaba 1.287 ok 124
175 wakaba 1.18 ok 125
176     ok 126
177 wakaba 1.20 ok 127
178 wakaba 1.240 ok 128
179 wakaba 1.20 ok 129
180 wakaba 1.287 ok 130
181 wakaba 1.240 ok 131
182 wakaba 1.20 ok 132
183     ok 133
184     ok 134
185 wakaba 1.287 ok 135
186 wakaba 1.20 ok 136
187 wakaba 1.303 not ok 137
188     # Test 137 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #137)
189     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'EndTag',\n qq'z'\n ]\n ];\n" (</z: qq'</z')
190     # Line 2 is changed:
191     # - " qq'ParseError',\n"
192     # + " qq'ParseError'\n"
193     # Lines 3-3 are missing:
194     # - " [\n"
195     # - " qq'EndTag',\n"
196     # - " qq'z'\n"
197     # - " ]\n"
198 wakaba 1.21 ok 138
199 wakaba 1.306 not ok 139
200     # Test 139 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #139)
201     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z : qq'<z ')
202     # Line 2 is changed:
203     # - " qq'ParseError',\n"
204     # + " qq'ParseError'\n"
205     # Lines 3-3 are missing:
206     # - " [\n"
207     # - " qq'StartTag',\n"
208     # - " qq'z',\n"
209     # - " {}\n"
210     # - " ]\n"
211 wakaba 1.20 ok 140
212 wakaba 1.306 not ok 141
213     # Test 141 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #141)
214     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z/ : qq'<z/ ')
215     # Line 3 is changed:
216     # - " qq'ParseError',\n"
217     # + " qq'ParseError'\n"
218     # Lines 4-4 are missing:
219     # - " [\n"
220     # - " qq'StartTag',\n"
221     # - " qq'z',\n"
222     # - " {}\n"
223     # - " ]\n"
224 wakaba 1.317 not ok 142
225     # Test 142 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #142)
226     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z//: qq'<z//')
227     # Line 3 is changed:
228     # - " qq'ParseError',\n"
229     # + " qq'ParseError'\n"
230     # Lines 4-4 are missing:
231     # - " [\n"
232     # - " qq'StartTag',\n"
233     # - " qq'z',\n"
234     # - " {}\n"
235     # - " ]\n"
236 wakaba 1.303 not ok 143
237     # Test 143 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #143)
238     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z: qq'<z')
239     # Line 2 is changed:
240     # - " qq'ParseError',\n"
241     # + " qq'ParseError'\n"
242     # Lines 3-3 are missing:
243     # - " [\n"
244     # - " qq'StartTag',\n"
245     # - " qq'z',\n"
246     # - " {}\n"
247     # - " ]\n"
248     not ok 144
249     # Test 144 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #144)
250     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'EndTag',\n qq'z'\n ]\n ];\n" (</z: qq'</z')
251     # Line 2 is changed:
252     # - " qq'ParseError',\n"
253     # + " qq'ParseError'\n"
254     # Lines 3-3 are missing:
255     # - " [\n"
256     # - " qq'EndTag',\n"
257     # - " qq'z'\n"
258     # - " ]\n"
259     not ok 145
260     # Test 145 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #145)
261     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z0',\n {}\n ]\n ];\n" (<z0: qq'<z0')
262     # Line 2 is changed:
263     # - " qq'ParseError',\n"
264     # + " qq'ParseError'\n"
265     # Lines 3-3 are missing:
266     # - " [\n"
267     # - " qq'StartTag',\n"
268     # - " qq'z0',\n"
269     # - " {}\n"
270     # - " ]\n"
271 wakaba 1.286 not ok 146
272     # Test 146 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #146)
273 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0=>: qq'<z/0=>')
274     # Got 1 extra line at line 3:
275     # + " qq'ParseError',\n"
276 wakaba 1.309 not ok 147
277     # Test 147 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #147)
278     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0= : qq'<z/0= ')
279     # Line 3 is changed:
280     # - " qq'ParseError',\n"
281     # + " qq'ParseError'\n"
282     # Lines 4-4 are missing:
283     # - " [\n"
284     # - " qq'StartTag',\n"
285     # - " qq'z',\n"
286     # - " {\n"
287     # - " 0 => qq''\n"
288     # - " }\n"
289     # - " ]\n"
290 wakaba 1.239 ok 148
291 wakaba 1.306 not ok 149
292     # Test 149 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #149)
293     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'?'\n }\n ]\n ];\n" (<z/0=? : qq'<z/0=? ')
294     # Line 3 is changed:
295     # - " qq'ParseError',\n"
296     # + " qq'ParseError'\n"
297     # Lines 4-4 are missing:
298     # - " [\n"
299     # - " qq'StartTag',\n"
300     # - " qq'z',\n"
301     # - " {\n"
302     # - " 0 => qq'?'\n"
303     # - " }\n"
304     # - " ]\n"
305 wakaba 1.314 not ok 150
306     # Test 150 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #150)
307     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'??'\n }\n ]\n ];\n" (<z/0=??: qq'<z/0=??')
308     # Line 3 is changed:
309     # - " qq'ParseError',\n"
310     # + " qq'ParseError'\n"
311     # Lines 4-4 are missing:
312     # - " [\n"
313     # - " qq'StartTag',\n"
314     # - " qq'z',\n"
315     # - " {\n"
316     # - " 0 => qq'??'\n"
317     # - " }\n"
318     # - " ]\n"
319 wakaba 1.316 not ok 151
320     # Test 151 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #151)
321     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0='': qq'<z/0=\x{27}\x{27}')
322     # Line 3 is changed:
323     # - " qq'ParseError',\n"
324     # + " qq'ParseError'\n"
325     # Lines 4-4 are missing:
326     # - " [\n"
327     # - " qq'StartTag',\n"
328     # - " qq'z',\n"
329     # - " {\n"
330     # - " 0 => qq''\n"
331     # - " }\n"
332     # - " ]\n"
333 wakaba 1.312 not ok 152
334     # Test 152 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #152)
335     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'&'\n }\n ]\n ];\n" (<z/0='&: qq'<z/0=\x{27}&')
336     # Line 3 is changed:
337     # - " qq'ParseError',\n"
338     # + " qq'ParseError'\n"
339     # Lines 4-4 are missing:
340     # - " [\n"
341     # - " qq'StartTag',\n"
342     # - " qq'z',\n"
343     # - " {\n"
344     # - " 0 => qq'&'\n"
345     # - " }\n"
346     # - " ]\n"
347     not ok 153
348     # Test 153 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #153)
349     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'%'\n }\n ]\n ];\n" (<z/0='%: qq'<z/0=\x{27}%')
350     # Line 3 is changed:
351     # - " qq'ParseError',\n"
352     # + " qq'ParseError'\n"
353     # Lines 4-4 are missing:
354     # - " [\n"
355     # - " qq'StartTag',\n"
356     # - " qq'z',\n"
357     # - " {\n"
358     # - " 0 => qq'%'\n"
359     # - " }\n"
360     # - " ]\n"
361 wakaba 1.22 ok 154
362 wakaba 1.316 not ok 155
363     # Test 155 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #155)
364     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0="": qq'<z/0=""')
365     # Line 3 is changed:
366     # - " qq'ParseError',\n"
367     # + " qq'ParseError'\n"
368     # Lines 4-4 are missing:
369     # - " [\n"
370     # - " qq'StartTag',\n"
371     # - " qq'z',\n"
372     # - " {\n"
373     # - " 0 => qq''\n"
374     # - " }\n"
375     # - " ]\n"
376 wakaba 1.22 ok 156
377 wakaba 1.314 not ok 157
378     # Test 157 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #157)
379     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'&'\n }\n ]\n ];\n" (<z/0=&: qq'<z/0=&')
380     # Line 3 is changed:
381     # - " qq'ParseError',\n"
382     # + " qq'ParseError'\n"
383     # Lines 4-4 are missing:
384     # - " [\n"
385     # - " qq'StartTag',\n"
386     # - " qq'z',\n"
387     # - " {\n"
388     # - " 0 => qq'&'\n"
389     # - " }\n"
390     # - " ]\n"
391 wakaba 1.28 ok 158
392 wakaba 1.309 not ok 159
393     # Test 159 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #159)
394     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 =: qq'<z/0 =')
395     # Line 3 is changed:
396     # - " qq'ParseError',\n"
397     # + " qq'ParseError'\n"
398     # Lines 4-4 are missing:
399     # - " [\n"
400     # - " qq'StartTag',\n"
401     # - " qq'z',\n"
402     # - " {\n"
403     # - " 0 => qq''\n"
404     # - " }\n"
405     # - " ]\n"
406 wakaba 1.239 ok 160
407 wakaba 1.308 not ok 161
408     # Test 161 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #161)
409     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 : qq'<z/0 ')
410     # Line 3 is changed:
411     # - " qq'ParseError',\n"
412     # + " qq'ParseError'\n"
413     # Lines 4-4 are missing:
414     # - " [\n"
415     # - " qq'StartTag',\n"
416     # - " qq'z',\n"
417     # - " {\n"
418     # - " 0 => qq''\n"
419     # - " }\n"
420     # - " ]\n"
421 wakaba 1.317 not ok 162
422     # Test 162 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #162)
423     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 /: qq'<z/0 /')
424     # Line 3 is changed:
425     # - " qq'ParseError',\n"
426     # + " qq'ParseError'\n"
427     # Lines 4-4 are missing:
428     # - " [\n"
429     # - " qq'StartTag',\n"
430     # - " qq'z',\n"
431     # - " {\n"
432     # - " 0 => qq''\n"
433     # - " }\n"
434     # - " ]\n"
435     not ok 163
436     # Test 163 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #163)
437     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0/: qq'<z/0/')
438     # Line 3 is changed:
439     # - " qq'ParseError',\n"
440     # + " qq'ParseError'\n"
441     # Lines 4-4 are missing:
442     # - " [\n"
443     # - " qq'StartTag',\n"
444     # - " qq'z',\n"
445     # - " {\n"
446     # - " 0 => qq''\n"
447     # - " }\n"
448     # - " ]\n"
449 wakaba 1.307 not ok 164
450     # Test 164 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #164)
451     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'00' => qq''\n }\n ]\n ];\n" (<z/00: qq'<z/00')
452     # Line 3 is changed:
453     # - " qq'ParseError',\n"
454     # + " qq'ParseError'\n"
455     # Lines 4-4 are missing:
456     # - " [\n"
457     # - " qq'StartTag',\n"
458     # - " qq'z',\n"
459     # - " {\n"
460     # - " qq'00' => qq''\n"
461     # - " }\n"
462     # - " ]\n"
463     not ok 165
464     # Test 165 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #165)
465     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 0: qq'<z/0 0')
466     # Line 4 is changed:
467     # - " qq'ParseError',\n"
468     # + " qq'ParseError'\n"
469     # Lines 5-5 are missing:
470     # - " [\n"
471     # - " qq'StartTag',\n"
472     # - " qq'z',\n"
473     # - " {\n"
474     # - " 0 => qq''\n"
475     # - " }\n"
476     # - " ]\n"
477 wakaba 1.312 not ok 166
478     # Test 166 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #166)
479     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'\\x{09}'\n }\n ]\n ];\n" (<z/0='&#9: qq'<z/0=\x{27}&#9')
480     # Line 4 is changed:
481     # - " qq'ParseError',\n"
482     # + " qq'ParseError'\n"
483     # Lines 5-5 are missing:
484     # - " [\n"
485     # - " qq'StartTag',\n"
486     # - " qq'z',\n"
487     # - " {\n"
488     # - " 0 => qq'\\x{09}'\n"
489     # - " }\n"
490     # - " ]\n"
491 wakaba 1.28 ok 167
492 wakaba 1.314 not ok 168
493     # Test 168 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #168)
494     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'\\x{09}'\n }\n ]\n ];\n" (<z/0=&#9: qq'<z/0=&#9')
495     # Line 4 is changed:
496     # - " qq'ParseError',\n"
497     # + " qq'ParseError'\n"
498     # Lines 5-5 are missing:
499     # - " [\n"
500     # - " qq'StartTag',\n"
501     # - " qq'z',\n"
502     # - " {\n"
503     # - " 0 => qq'\\x{09}'\n"
504     # - " }\n"
505     # - " ]\n"
506 wakaba 1.307 not ok 169
507     # Test 169 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #169)
508     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'0z' => qq''\n }\n ]\n ];\n" (<z/0z: qq'<z/0z')
509     # Line 3 is changed:
510     # - " qq'ParseError',\n"
511     # + " qq'ParseError'\n"
512     # Lines 4-4 are missing:
513     # - " [\n"
514     # - " qq'StartTag',\n"
515     # - " qq'z',\n"
516     # - " {\n"
517     # - " qq'0z' => qq''\n"
518     # - " }\n"
519     # - " ]\n"
520     not ok 170
521     # Test 170 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #170)
522     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'',\n qq'z' => qq''\n }\n ]\n ];\n" (<z/0 z: qq'<z/0 z')
523     # Line 3 is changed:
524     # - " qq'ParseError',\n"
525     # + " qq'ParseError'\n"
526     # Lines 4-4 are missing:
527     # - " [\n"
528     # - " qq'StartTag',\n"
529     # - " qq'z',\n"
530     # - " {\n"
531     # - " 0 => qq'',\n"
532     # - " qq'z' => qq''\n"
533     # - " }\n"
534     # - " ]\n"
535 wakaba 1.303 not ok 171
536     # Test 171 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #171)
537     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'zz',\n {}\n ]\n ];\n" (<zz: qq'<zz')
538     # Line 2 is changed:
539     # - " qq'ParseError',\n"
540     # + " qq'ParseError'\n"
541     # Lines 3-3 are missing:
542     # - " [\n"
543     # - " qq'StartTag',\n"
544     # - " qq'zz',\n"
545     # - " {}\n"
546     # - " ]\n"
547 wakaba 1.307 not ok 172
548     # Test 172 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #172)
549     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'z' => qq''\n }\n ]\n ];\n" (<z/z: qq'<z/z')
550     # Line 3 is changed:
551     # - " qq'ParseError',\n"
552     # + " qq'ParseError'\n"
553     # Lines 4-4 are missing:
554     # - " [\n"
555     # - " qq'StartTag',\n"
556     # - " qq'z',\n"
557     # - " {\n"
558     # - " qq'z' => qq''\n"
559     # - " }\n"
560     # - " ]\n"
561 wakaba 1.286 # t/tokenizer/test4.test
562 wakaba 1.299 not ok 173
563 wakaba 1.307 # Test 173 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #173)
564 wakaba 1.299 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'',\n qq'<' => qq''\n }\n ]\n ];\n" (< in attribute name: qq'<z/0 <')
565 wakaba 1.307 # Line 4 is changed:
566     # - " [\n"
567     # + " qq'ParseError'\n"
568     # Lines 5-5 are missing:
569     # - " qq'StartTag',\n"
570     # - " qq'z',\n"
571     # - " {\n"
572     # - " 0 => qq'',\n"
573     # - " qq'<' => qq''\n"
574     # - " }\n"
575     # - " ]\n"
576 wakaba 1.293 not ok 174
577 wakaba 1.314 # Test 174 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #174)
578 wakaba 1.293 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'x' => qq'<'\n }\n ]\n ];\n" (< in attribute value: qq'<z x=<')
579 wakaba 1.314 # Line 3 is changed:
580     # - " [\n"
581     # + " qq'ParseError'\n"
582     # Lines 4-4 are missing:
583     # - " qq'StartTag',\n"
584     # - " qq'z',\n"
585     # - " {\n"
586     # - " qq'x' => qq'<'\n"
587     # - " }\n"
588     # - " ]\n"
589 wakaba 1.286 ok 175
590     ok 176
591     not ok 177
592     # Test 177 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #177)
593 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (== attribute: qq'<z ==>')
594     # Got 1 extra line at line 3:
595     # + " qq'ParseError',\n"
596 wakaba 1.28 ok 178
597 wakaba 1.33 ok 179
598 wakaba 1.34 ok 180
599 wakaba 1.38 ok 181
600     ok 182
601 wakaba 1.43 ok 183
602     ok 184
603     ok 185
604     ok 186
605     ok 187
606     ok 188
607 wakaba 1.240 ok 189
608     ok 190
609 wakaba 1.43 ok 191
610     ok 192
611     ok 193
612     ok 194
613     ok 195
614     ok 196
615 wakaba 1.306 not ok 197
616     # Test 197 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #197)
617     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (CR EOF in tag name: qq'<z\x{0D}')
618     # Line 2 is changed:
619     # - " qq'ParseError',\n"
620     # + " qq'ParseError'\n"
621     # Lines 3-3 are missing:
622     # - " [\n"
623     # - " qq'StartTag',\n"
624     # - " qq'z',\n"
625     # - " {}\n"
626     # - " ]\n"
627 wakaba 1.96 ok 198
628     ok 199
629 wakaba 1.286 ok 200
630 wakaba 1.96 ok 201
631 wakaba 1.130 ok 202
632 wakaba 1.43 ok 203
633     ok 204
634 wakaba 1.318 not ok 205
635     # Test 205 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{10FFFF}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #205)
636     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Maximum non-BMP numeric entity: qq'&#X10FFFF;')
637     # Line 5 is changed:
638     # - " qq'\\x{FFFD}'\n"
639     # + " qq'\\x{10FFFF}'\n"
640 wakaba 1.43 ok 206
641     ok 207
642     ok 208
643     ok 209
644     ok 210
645     ok 211
646 wakaba 1.318 not ok 212
647     # Test 212 got: "$VAR1 = [\n [\n qq'Character',\n qq'\\x{D7FF}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{D800}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{D801}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DFFE}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DFFF}\\x{E000}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #212)
648     # Expected: "$VAR1 = [\n [\n qq'Character',\n qq'\\x{D7FF}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}\\x{E000}'\n ]\n ];\n" (Surrogate code point edge cases: qq'&#xD7FF;&#xD800;&#xD801;&#xDFFE;&#xDFFF;&#xE000;')
649     # Line 9 is changed:
650     # - " qq'\\x{FFFD}'\n"
651     # + " qq'\\x{D800}'\n"
652     # Line 14 is changed:
653     # - " qq'\\x{FFFD}'\n"
654     # + " qq'\\x{D801}'\n"
655     # Line 19 is changed:
656     # - " qq'\\x{FFFD}'\n"
657     # + " qq'\\x{DFFE}'\n"
658     # Line 24 is changed:
659     # - " qq'\\x{FFFD}\\x{E000}'\n"
660     # + " qq'\\x{DFFF}\\x{E000}'\n"
661 wakaba 1.43 ok 213
662     ok 214
663 wakaba 1.240 ok 215
664     ok 216
665 wakaba 1.43 ok 217
666     ok 218
667     ok 219
668     ok 220
669 wakaba 1.141 ok 221
670 wakaba 1.286 ok 222
671 wakaba 1.298 not ok 223
672     # Test 223 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #223)
673     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (1): qq'<!DoCtYpE HtMl PuBlIc "AbC" "XyZ">')
674     # Line 4 is changed:
675     # - " qq'HtMl',\n"
676     # + " qq'html',\n"
677     not ok 224
678     # Test 224 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #224)
679     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (2): qq'<!dOcTyPe hTmL pUbLiC "aBc" "xYz">')
680     # Line 4 is changed:
681     # - " qq'hTmL',\n"
682     # + " qq'html',\n"
683     not ok 225
684     # Test 225 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #225)
685     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (1): qq'<!DoCtYpE HtMl SyStEm "XyZ">')
686     # Line 4 is changed:
687     # - " qq'HtMl',\n"
688     # + " qq'html',\n"
689     not ok 226
690     # Test 226 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #226)
691     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (2): qq'<!dOcTyPe hTmL sYsTeM "xYz">')
692     # Line 4 is changed:
693     # - " qq'hTmL',\n"
694     # + " qq'html',\n"
695 wakaba 1.286 not ok 227
696     # Test 227 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #227)
697 wakaba 1.130 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (U+0000 in lookahead region after non-matching character: qq'<!doc>\x{00}')
698     # Got 1 extra line at line 3:
699     # + " qq'ParseError',\n"
700     # Line 8 is missing:
701     # - " qq'ParseError',\n"
702 wakaba 1.43 ok 228
703     ok 229
704     ok 230
705     ok 231
706     ok 232
707     ok 233
708     ok 234
709     ok 235
710 wakaba 1.141 ok 236
711 wakaba 1.43 ok 237
712     ok 238
713     ok 239
714     ok 240
715     ok 241
716 wakaba 1.287 ok 242
717 wakaba 1.43 ok 243
718 wakaba 1.287 ok 244
719 wakaba 1.286 # t/tokenizer/contentModelFlags.test
720 wakaba 1.43 ok 245
721     ok 246
722     ok 247
723     ok 248
724 wakaba 1.141 ok 249
725 wakaba 1.43 ok 250
726     ok 251
727     ok 252
728     ok 253
729     ok 254
730     ok 255
731 wakaba 1.141 ok 256
732 wakaba 1.43 ok 257
733 wakaba 1.286 # t/tokenizer/escapeFlag.test
734 wakaba 1.43 ok 258
735     ok 259
736     ok 260
737     ok 261
738     ok 262
739 wakaba 1.206 ok 263
740 wakaba 1.43 ok 264
741     ok 265
742     ok 266
743 wakaba 1.286 # t/tokenizer/entities.test
744 wakaba 1.43 ok 267
745     ok 268
746     ok 269
747     ok 270
748     ok 271
749     ok 272
750     ok 273
751     ok 274
752     ok 275
753     ok 276
754     ok 277
755     ok 278
756     ok 279
757     ok 280
758     ok 281
759     ok 282
760     ok 283
761     ok 284
762     ok 285
763     ok 286
764     ok 287
765     ok 288
766     ok 289
767     ok 290
768     ok 291
769     ok 292
770     ok 293
771     ok 294
772     ok 295
773     ok 296
774     ok 297
775     ok 298
776     ok 299
777     ok 300
778     ok 301
779     ok 302
780     ok 303
781     ok 304
782     ok 305
783     ok 306
784     ok 307
785     ok 308
786     ok 309
787     ok 310
788     ok 311
789     ok 312
790     ok 313
791     ok 314
792     ok 315
793     ok 316
794     ok 317
795     ok 318
796     ok 319
797     ok 320
798     ok 321
799     ok 322
800     ok 323
801     ok 324
802     ok 325
803     ok 326
804     ok 327
805     ok 328
806     ok 329
807     ok 330
808     ok 331
809     ok 332
810     ok 333
811     ok 334
812     ok 335
813     ok 336
814     ok 337
815 wakaba 1.59 ok 338
816     ok 339
817     ok 340
818     ok 341
819     ok 342
820     ok 343
821     ok 344
822     ok 345
823     ok 346
824     ok 347
825 wakaba 1.62 ok 348
826     ok 349
827     ok 350
828     ok 351
829     ok 352
830     ok 353
831     ok 354
832     ok 355
833     ok 356
834     ok 357
835     ok 358
836     ok 359
837 wakaba 1.96 ok 360
838     ok 361
839     ok 362
840     ok 363
841 wakaba 1.129 ok 364
842     ok 365
843     ok 366
844     ok 367
845     ok 368
846     ok 369
847     ok 370
848     ok 371
849     ok 372
850     ok 373
851     ok 374
852     ok 375
853     ok 376
854     ok 377
855     ok 378
856     ok 379
857     ok 380
858     ok 381
859     ok 382
860     ok 383
861     ok 384
862     ok 385
863     ok 386
864     ok 387
865     ok 388
866     ok 389
867     ok 390
868     ok 391
869     ok 392
870     ok 393
871     ok 394
872     ok 395
873     ok 396
874 wakaba 1.130 ok 397
875     ok 398
876     ok 399
877     ok 400
878     ok 401
879     ok 402
880     ok 403
881     ok 404
882     ok 405
883     ok 406
884     ok 407
885     ok 408
886     ok 409
887     ok 410
888     ok 411
889     ok 412
890     ok 413
891     ok 414
892     ok 415
893     ok 416
894 wakaba 1.132 ok 417
895     ok 418
896     ok 419
897     ok 420
898 wakaba 1.136 ok 421
899     ok 422
900     ok 423
901     ok 424
902     ok 425
903     ok 426
904     ok 427
905     ok 428
906     ok 429
907     ok 430
908     ok 431
909     ok 432
910     ok 433
911     ok 434
912 wakaba 1.205 ok 435
913 wakaba 1.136 ok 436
914     ok 437
915     ok 438
916 wakaba 1.205 ok 439
917 wakaba 1.136 ok 440
918     ok 441
919     ok 442
920 wakaba 1.205 ok 443
921 wakaba 1.136 ok 444
922     ok 445
923 wakaba 1.205 ok 446
924 wakaba 1.136 ok 447
925     ok 448
926     ok 449
927     ok 450
928     ok 451
929     ok 452
930     ok 453
931     ok 454
932     ok 455
933     ok 456
934     ok 457
935     ok 458
936     ok 459
937     ok 460
938     ok 461
939     ok 462
940     ok 463
941     ok 464
942     ok 465
943     ok 466
944     ok 467
945     ok 468
946     ok 469
947     ok 470
948     ok 471
949 wakaba 1.141 ok 472
950 wakaba 1.195 ok 473
951     ok 474
952     ok 475
953     ok 476
954     ok 477
955 wakaba 1.205 ok 478
956     ok 479
957     ok 480
958     ok 481
959     ok 482
960     ok 483
961     ok 484
962     ok 485
963     ok 486
964     ok 487
965     ok 488
966     ok 489
967     ok 490
968     ok 491
969     ok 492
970     ok 493
971     ok 494
972     ok 495
973     ok 496
974     ok 497
975     ok 498
976     ok 499
977     ok 500
978     ok 501
979     ok 502
980     ok 503
981     ok 504
982     ok 505
983     ok 506
984     ok 507
985     ok 508
986     ok 509
987     ok 510
988     ok 511
989     ok 512
990     ok 513
991     ok 514
992     ok 515
993     ok 516
994     ok 517
995     ok 518
996     ok 519
997     ok 520
998     ok 521
999     ok 522
1000     ok 523
1001     ok 524
1002     ok 525
1003     ok 526
1004     ok 527
1005     ok 528
1006     ok 529
1007     ok 530
1008     ok 531
1009     ok 532
1010     ok 533
1011     ok 534
1012     ok 535
1013     ok 536
1014     ok 537
1015     ok 538
1016     ok 539
1017 wakaba 1.210 ok 540
1018 wakaba 1.205 ok 541
1019     ok 542
1020     ok 543
1021     ok 544
1022     ok 545
1023     ok 546
1024     ok 547
1025     ok 548
1026     ok 549
1027     ok 550
1028     ok 551
1029     ok 552
1030     ok 553
1031     ok 554
1032     ok 555
1033     ok 556
1034     ok 557
1035     ok 558
1036     ok 559
1037     ok 560
1038     ok 561
1039     ok 562
1040     ok 563
1041     ok 564
1042     ok 565
1043     ok 566
1044     ok 567
1045     ok 568
1046     ok 569
1047     ok 570
1048     ok 571
1049     ok 572
1050     ok 573
1051     ok 574
1052     ok 575
1053     ok 576
1054     ok 577
1055     ok 578
1056     ok 579
1057     ok 580
1058     ok 581
1059     ok 582
1060     ok 583
1061     ok 584
1062     ok 585
1063     ok 586
1064     ok 587
1065     ok 588
1066     ok 589
1067     ok 590
1068     ok 591
1069     ok 592
1070     ok 593
1071     ok 594
1072     ok 595
1073     ok 596
1074     ok 597
1075     ok 598
1076     ok 599
1077     ok 600
1078     ok 601
1079     ok 602
1080     ok 603
1081     ok 604
1082     ok 605
1083     ok 606
1084     ok 607
1085     ok 608
1086     ok 609
1087     ok 610
1088     ok 611
1089     ok 612
1090     ok 613
1091     ok 614
1092     ok 615
1093     ok 616
1094     ok 617
1095     ok 618
1096     ok 619
1097     ok 620
1098     ok 621
1099     ok 622
1100     ok 623
1101     ok 624
1102     ok 625
1103     ok 626
1104     ok 627
1105     ok 628
1106     ok 629
1107     ok 630
1108     ok 631
1109     ok 632
1110     ok 633
1111     ok 634
1112     ok 635
1113     ok 636
1114     ok 637
1115     ok 638
1116     ok 639
1117     ok 640
1118     ok 641
1119     ok 642
1120     ok 643
1121     ok 644
1122     ok 645
1123     ok 646
1124     ok 647
1125     ok 648
1126     ok 649
1127     ok 650
1128     ok 651
1129     ok 652
1130     ok 653
1131     ok 654
1132     ok 655
1133     ok 656
1134     ok 657
1135     ok 658
1136     ok 659
1137     ok 660
1138     ok 661
1139     ok 662
1140     ok 663
1141     ok 664
1142     ok 665
1143     ok 666
1144     ok 667
1145     ok 668
1146     ok 669
1147     ok 670
1148     ok 671
1149     ok 672
1150     ok 673
1151     ok 674
1152     ok 675
1153     ok 676
1154     ok 677
1155     ok 678
1156     ok 679
1157     ok 680
1158     ok 681
1159     ok 682
1160     ok 683
1161     ok 684
1162     ok 685
1163     ok 686
1164     ok 687
1165     ok 688
1166     ok 689
1167     ok 690
1168     ok 691
1169     ok 692
1170     ok 693
1171     ok 694
1172     ok 695
1173     ok 696
1174     ok 697
1175     ok 698
1176     ok 699
1177     ok 700
1178     ok 701
1179     ok 702
1180     ok 703
1181     ok 704
1182     ok 705
1183     ok 706
1184     ok 707
1185     ok 708
1186     ok 709
1187     ok 710
1188     ok 711
1189     ok 712
1190     ok 713
1191     ok 714
1192     ok 715
1193     ok 716
1194     ok 717
1195     ok 718
1196     ok 719
1197     ok 720
1198     ok 721
1199     ok 722
1200     ok 723
1201     ok 724
1202     ok 725
1203     ok 726
1204     ok 727
1205     ok 728
1206     ok 729
1207     ok 730
1208     ok 731
1209     ok 732
1210     ok 733
1211     ok 734
1212     ok 735
1213     ok 736
1214     ok 737
1215     ok 738
1216     ok 739
1217     ok 740
1218     ok 741
1219     ok 742
1220     ok 743
1221     ok 744
1222     ok 745
1223     ok 746
1224     ok 747
1225     ok 748
1226     ok 749
1227     ok 750
1228     ok 751
1229     ok 752
1230     ok 753
1231     ok 754
1232     ok 755
1233     ok 756
1234     ok 757
1235     ok 758
1236     ok 759
1237     ok 760
1238     ok 761
1239     ok 762
1240     ok 763
1241     ok 764
1242     ok 765
1243     ok 766
1244     ok 767
1245     ok 768
1246     ok 769
1247     ok 770
1248     ok 771
1249     ok 772
1250     ok 773
1251     ok 774
1252     ok 775
1253     ok 776
1254     ok 777
1255     ok 778
1256     ok 779
1257     ok 780
1258     ok 781
1259     ok 782
1260     ok 783
1261     ok 784
1262     ok 785
1263     ok 786
1264     ok 787
1265     ok 788
1266     ok 789
1267 wakaba 1.318 not ok 790
1268     # Test 790 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{81}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #790)
1269     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR numeric entity.: qq'&#0129;')
1270     # Line 5 is changed:
1271     # - " qq'\\x{FFFD}'\n"
1272     # + " qq'\\x{81}'\n"
1273 wakaba 1.205 ok 791
1274     ok 792
1275     ok 793
1276     ok 794
1277     ok 795
1278     ok 796
1279     ok 797
1280     ok 798
1281     ok 799
1282     ok 800
1283     ok 801
1284 wakaba 1.318 not ok 802
1285     # Test 802 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{8D}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #802)
1286     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR numeric entity.: qq'&#0141;')
1287     # Line 5 is changed:
1288     # - " qq'\\x{FFFD}'\n"
1289     # + " qq'\\x{8D}'\n"
1290 wakaba 1.205 ok 803
1291 wakaba 1.318 not ok 804
1292     # Test 804 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{8F}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #804)
1293     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR numeric entity.: qq'&#0143;')
1294     # Line 5 is changed:
1295     # - " qq'\\x{FFFD}'\n"
1296     # + " qq'\\x{8F}'\n"
1297     not ok 805
1298     # Test 805 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{90}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #805)
1299     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR numeric entity.: qq'&#0144;')
1300     # Line 5 is changed:
1301     # - " qq'\\x{FFFD}'\n"
1302     # + " qq'\\x{90}'\n"
1303 wakaba 1.205 ok 806
1304     ok 807
1305     ok 808
1306     ok 809
1307     ok 810
1308     ok 811
1309     ok 812
1310     ok 813
1311     ok 814
1312     ok 815
1313     ok 816
1314     ok 817
1315 wakaba 1.318 not ok 818
1316     # Test 818 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{9D}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #818)
1317     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR numeric entity.: qq'&#0157;')
1318     # Line 5 is changed:
1319     # - " qq'\\x{FFFD}'\n"
1320     # + " qq'\\x{9D}'\n"
1321 wakaba 1.205 ok 819
1322 wakaba 1.318 not ok 820
1323     # Test 820 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{81}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #820)
1324     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR hexadecimal numeric entity.: qq'&#x081;')
1325     # Line 5 is changed:
1326     # - " qq'\\x{FFFD}'\n"
1327     # + " qq'\\x{81}'\n"
1328 wakaba 1.205 ok 821
1329     ok 822
1330     ok 823
1331     ok 824
1332     ok 825
1333     ok 826
1334     ok 827
1335     ok 828
1336     ok 829
1337     ok 830
1338     ok 831
1339 wakaba 1.318 not ok 832
1340     # Test 832 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{8D}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #832)
1341     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR hexadecimal numeric entity.: qq'&#x08D;')
1342     # Line 5 is changed:
1343     # - " qq'\\x{FFFD}'\n"
1344     # + " qq'\\x{8D}'\n"
1345 wakaba 1.205 ok 833
1346 wakaba 1.318 not ok 834
1347     # Test 834 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{8F}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #834)
1348     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR hexadecimal numeric entity.: qq'&#x08F;')
1349     # Line 5 is changed:
1350     # - " qq'\\x{FFFD}'\n"
1351     # + " qq'\\x{8F}'\n"
1352     not ok 835
1353     # Test 835 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{90}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #835)
1354     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR hexadecimal numeric entity.: qq'&#x090;')
1355     # Line 5 is changed:
1356     # - " qq'\\x{FFFD}'\n"
1357     # + " qq'\\x{90}'\n"
1358 wakaba 1.205 ok 836
1359     ok 837
1360     ok 838
1361     ok 839
1362     ok 840
1363     ok 841
1364     ok 842
1365     ok 843
1366     ok 844
1367     ok 845
1368 wakaba 1.286 ok 846
1369     ok 847
1370 wakaba 1.318 not ok 848
1371     # Test 848 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{9D}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #848)
1372     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (Windows-1252 REPLACEMENT CHAR hexadecimal numeric entity.: qq'&#x09D;')
1373     # Line 5 is changed:
1374     # - " qq'\\x{FFFD}'\n"
1375     # + " qq'\\x{9D}'\n"
1376 wakaba 1.286 ok 849
1377     ok 850
1378 wakaba 1.205 # t/tokenizer/xmlViolation.test
1379 wakaba 1.286 not ok 851
1380     # Test 851 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFF}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #851)
1381 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFD}b'\n ]\n ];\n" (Non-XML character: qq'a\x{FFFF}b')
1382     # Line 5 is changed:
1383     # - " qq'a\\x{FFFD}b'\n"
1384     # + " qq'a\\x{FFFF}b'\n"
1385 wakaba 1.286 not ok 852
1386     # Test 852 got: "$VAR1 = [\n [\n qq'Character',\n qq'a\\x{0C}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #852)
1387 wakaba 1.206 # Expected: "$VAR1 = [\n [\n qq'Character',\n qq'a b'\n ]\n ];\n" (Non-XML space: qq'a\x{0C}b')
1388     # Line 4 is changed:
1389     # - " qq'a b'\n"
1390     # + " qq'a\\x{0C}b'\n"
1391 wakaba 1.286 not ok 853
1392 wakaba 1.302 # Test 853 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' foo -- bar '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #853)
1393 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' foo - - bar '\n ]\n ];\n" (Double hyphen in comment: qq'<!-- foo -- bar -->')
1394 wakaba 1.302 # Line 5 is changed:
1395 wakaba 1.206 # - " qq' foo - - bar '\n"
1396     # + " qq' foo -- bar '\n"
1397 wakaba 1.286 ok 854
1398 wakaba 1.205 # t/tokenizer-test-1.test
1399     ok 855
1400     ok 856
1401     ok 857
1402     ok 858
1403     ok 859
1404     ok 860
1405     ok 861
1406     ok 862
1407     ok 863
1408     ok 864
1409     ok 865
1410     ok 866
1411     ok 867
1412     ok 868
1413     ok 869
1414     ok 870
1415     ok 871
1416     ok 872
1417     ok 873
1418     ok 874
1419     ok 875
1420     ok 876
1421     ok 877
1422     ok 878
1423     ok 879
1424     ok 880
1425     ok 881
1426     ok 882
1427     ok 883
1428     ok 884
1429     ok 885
1430     ok 886
1431     ok 887
1432     ok 888
1433     ok 889
1434     ok 890
1435     ok 891
1436     ok 892
1437     ok 893
1438     ok 894
1439     ok 895
1440     ok 896
1441     ok 897
1442     ok 898
1443     ok 899
1444     ok 900
1445     ok 901
1446     ok 902
1447     ok 903
1448     ok 904
1449     ok 905
1450     ok 906
1451     ok 907
1452     ok 908
1453     ok 909
1454     ok 910
1455     ok 911
1456     ok 912
1457     ok 913
1458     ok 914
1459     ok 915
1460     ok 916
1461     ok 917
1462     ok 918
1463     ok 919
1464     ok 920
1465     ok 921
1466     ok 922
1467     ok 923
1468     ok 924
1469     ok 925
1470 wakaba 1.298 ok 926
1471     ok 927
1472     not ok 928
1473     # Test 928 got: "$VAR1 = [\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #928)
1474 wakaba 1.296 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (<!----x-->: qq'<!----x-->')
1475     # Line 2 is missing:
1476     # - " qq'ParseError',\n"
1477 wakaba 1.205 ok 929
1478     ok 930
1479     ok 931
1480     ok 932
1481     ok 933
1482     ok 934
1483     ok 935
1484     ok 936
1485     ok 937
1486 wakaba 1.281 ok 938
1487     ok 939
1488     ok 940
1489     ok 941
1490     ok 942
1491     ok 943
1492     ok 944
1493     ok 945
1494 wakaba 1.285 ok 946
1495 wakaba 1.205 ok 947
1496     ok 948
1497     ok 949
1498     ok 950
1499     ok 951
1500     ok 952
1501     ok 953
1502     ok 954
1503     ok 955
1504     ok 956
1505     ok 957
1506     ok 958
1507     ok 959
1508     ok 960
1509     ok 961
1510     ok 962
1511 wakaba 1.286 ok 963
1512     ok 964
1513 wakaba 1.290 ok 965
1514     ok 966
1515     ok 967
1516 wakaba 1.298 ok 968
1517     ok 969
1518     not ok 970
1519 wakaba 1.318 # Test 970 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{D800}\\x{DFFF}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #970)
1520     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{D800}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DFFF}'\n ]\n ];\n" (surrogate character reference: qq'&#xD800;\x{DFFF}')
1521 wakaba 1.285 # Lines 3-3 are missing:
1522     # - " [\n"
1523     # - " qq'Character',\n"
1524 wakaba 1.318 # - " qq'\\x{D800}'\n"
1525 wakaba 1.285 # - " ],\n"
1526     # Line 6 is changed:
1527     # - " qq'\\x{DFFF}'\n"
1528 wakaba 1.318 # + " qq'\\x{D800}\\x{DFFF}'\n"
1529 wakaba 1.205 ok 971
1530     ok 972
1531     ok 973
1532     ok 974
1533     ok 975
1534     ok 976
1535     ok 977
1536     ok 978
1537     ok 979
1538     ok 980
1539     ok 981
1540     ok 982
1541     ok 983
1542     ok 984
1543     ok 985
1544     ok 986
1545     ok 987
1546     ok 988
1547     ok 989
1548     ok 990
1549     ok 991
1550     ok 992
1551     ok 993
1552     ok 994
1553     ok 995
1554     ok 996
1555     ok 997
1556     ok 998
1557     ok 999
1558     ok 1000
1559     ok 1001
1560     ok 1002
1561     ok 1003
1562     ok 1004
1563     ok 1005
1564     ok 1006
1565     ok 1007
1566     ok 1008
1567     ok 1009
1568     ok 1010
1569     ok 1011
1570     ok 1012
1571     ok 1013
1572     ok 1014
1573     ok 1015
1574     ok 1016
1575     ok 1017
1576     ok 1018
1577 wakaba 1.206 ok 1019
1578     ok 1020
1579 wakaba 1.312 not ok 1021
1580     # Test 1021 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1021)
1581     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'a',\n {\n qq'href' => qq'\\x{A9}'\n }\n ]\n ];\n" (entity w/o refc at the end of unterminated attribute value: qq'<a href=\x{27}&COPY')
1582     # Line 3 is changed:
1583     # - " qq'ParseError',\n"
1584     # + " qq'ParseError'\n"
1585     # Lines 4-4 are missing:
1586     # - " [\n"
1587     # - " qq'StartTag',\n"
1588     # - " qq'a',\n"
1589     # - " {\n"
1590     # - " qq'href' => qq'\\x{A9}'\n"
1591     # - " }\n"
1592     # - " ]\n"
1593 wakaba 1.206 ok 1022
1594     ok 1023
1595     ok 1024
1596     ok 1025
1597 wakaba 1.240 ok 1026
1598 wakaba 1.206 ok 1027
1599     ok 1028
1600     ok 1029
1601 wakaba 1.240 ok 1030
1602 wakaba 1.206 ok 1031
1603     ok 1032
1604     ok 1033
1605 wakaba 1.240 ok 1034
1606 wakaba 1.206 ok 1035
1607     ok 1036
1608 wakaba 1.240 ok 1037
1609 wakaba 1.205 ok 1038
1610     ok 1039
1611 wakaba 1.298 ok 1040
1612     ok 1041
1613 wakaba 1.299 ok 1042
1614 wakaba 1.298 ok 1043
1615 wakaba 1.299 ok 1044
1616 wakaba 1.205 ok 1045
1617 wakaba 1.312 not ok 1046
1618     # Test 1046 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1046)
1619     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'left<div>'\n }\n ]\n ];\n" (< in attribute value (single-unquoted) state: qq'<p align=\x{27}left<div>')
1620     # Line 2 is changed:
1621     # - " qq'ParseError',\n"
1622     # + " qq'ParseError'\n"
1623     # Lines 3-3 are missing:
1624     # - " [\n"
1625     # - " qq'StartTag',\n"
1626     # - " qq'p',\n"
1627     # - " {\n"
1628     # - " qq'align' => qq'left<div>'\n"
1629     # - " }\n"
1630     # - " ]\n"
1631 wakaba 1.205 ok 1047
1632     ok 1048
1633     ok 1049
1634     ok 1050
1635     ok 1051
1636     ok 1052
1637     ok 1053
1638     ok 1054
1639     ok 1055
1640     ok 1056
1641     ok 1057
1642     ok 1058
1643     ok 1059
1644     ok 1060
1645     ok 1061
1646 wakaba 1.206 ok 1062
1647     ok 1063
1648     ok 1064
1649     ok 1065
1650     ok 1066
1651     ok 1067
1652     ok 1068
1653 wakaba 1.227 ok 1069
1654     ok 1070
1655     ok 1071
1656     ok 1072
1657     ok 1073
1658 wakaba 1.247 ok 1074
1659     ok 1075
1660     ok 1076
1661     ok 1077
1662     ok 1078
1663     ok 1079
1664     ok 1080
1665 wakaba 1.281 ok 1081
1666     ok 1082
1667     ok 1083
1668     ok 1084
1669     ok 1085
1670     ok 1086
1671     ok 1087
1672     ok 1088
1673     ok 1089
1674     ok 1090
1675     ok 1091
1676     ok 1092
1677     ok 1093
1678     ok 1094
1679     ok 1095
1680     ok 1096
1681     ok 1097
1682 wakaba 1.285 ok 1098
1683     ok 1099
1684     ok 1100
1685     ok 1101
1686     ok 1102
1687     ok 1103
1688     ok 1104
1689     ok 1105
1690 wakaba 1.305 ok 1106
1691 wakaba 1.285 ok 1107
1692     ok 1108
1693     ok 1109
1694 wakaba 1.305 ok 1110
1695 wakaba 1.285 ok 1111
1696     ok 1112
1697     ok 1113
1698     ok 1114
1699     ok 1115
1700     ok 1116
1701     ok 1117
1702     ok 1118
1703     ok 1119
1704     ok 1120
1705     ok 1121
1706     ok 1122
1707     ok 1123
1708 wakaba 1.312 not ok 1124
1709     # Test 1124 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1124)
1710     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'a',\n {\n qq'a' => qq'>'\n }\n ]\n ];\n" (<a a='>: qq'<a a=\x{27}>')
1711     # Line 2 is changed:
1712     # - " qq'ParseError',\n"
1713     # + " qq'ParseError'\n"
1714     # Lines 3-3 are missing:
1715     # - " [\n"
1716     # - " qq'StartTag',\n"
1717     # - " qq'a',\n"
1718     # - " {\n"
1719     # - " qq'a' => qq'>'\n"
1720     # - " }\n"
1721     # - " ]\n"
1722 wakaba 1.285 ok 1125
1723     ok 1126
1724     ok 1127
1725 wakaba 1.305 ok 1128
1726 wakaba 1.286 ok 1129
1727 wakaba 1.305 ok 1130
1728 wakaba 1.290 ok 1131
1729     ok 1132
1730 wakaba 1.293 ok 1133
1731     ok 1134
1732 wakaba 1.308 ok 1135
1733 wakaba 1.298 ok 1136
1734 wakaba 1.309 ok 1137
1735 wakaba 1.310 ok 1138
1736 wakaba 1.313 ok 1139
1737 wakaba 1.301 ok 1140
1738 wakaba 1.315 ok 1141
1739 wakaba 1.305 ok 1142
1740     ok 1143
1741 wakaba 1.302 ok 1144
1742     ok 1145
1743 wakaba 1.316 ok 1146
1744     ok 1147
1745 wakaba 1.317 ok 1148
1746     ok 1149
1747     ok 1150
