Contents of /markup/html/whatpm/t/tokenizer-result.txt

Revision 1.317
Sat Sep 5 10:38:11 2009 UTC by wakaba
Branch: MAIN
Changes since 1.316: +45 -41 lines
File MIME type: text/plain
updated

1 wakaba 1.287 1..1129
2 wakaba 1.273 # Running under perl version 5.010000 for linux
3 wakaba 1.317 # Current time local: Sat Sep 5 19:37:32 2009
4     # Current time GMT: Sat Sep 5 10:37:32 2009
5 wakaba 1.1 # Using Test.pm version 1.25
6 wakaba 1.11 # t/tokenizer/test1.test
7 wakaba 1.20 ok 1
8 wakaba 1.298 not ok 2
9     # Test 2 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #2)
10     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HTML',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype uppercase: qq'<!DOCTYPE HTML>')
11     # Line 4 is changed:
12     # - " qq'HTML',\n"
13     # + " qq'html',\n"
14     # t/HTML-tokenizer.t line 205 is: ok $parser_dump, $expected_dump,
15     not ok 3
16     # Test 3 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #3)
17     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype mixed case: qq'<!DOCTYPE HtMl>')
18     # Line 4 is changed:
19     # - " qq'HtMl',\n"
20     # + " qq'html',\n"
21 wakaba 1.1 ok 4
22 wakaba 1.20 ok 5
23 wakaba 1.1 ok 6
24     ok 7
25     ok 8
26     ok 9
27     ok 10
28     ok 11
29     ok 12
30     ok 13
31     ok 14
32 wakaba 1.130 ok 15
33 wakaba 1.1 ok 16
34     ok 17
35     ok 18
36 wakaba 1.296 not ok 19
37     # Test 19 got: "$VAR1 = [\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #19)
38     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (Comment, two central dashes: qq'<!-- --comment -->')
39     # Line 2 is missing:
40     # - " qq'ParseError',\n"
41 wakaba 1.1 ok 20
42     ok 21
43 wakaba 1.25 ok 22
44     ok 23
45 wakaba 1.1 ok 24
46 wakaba 1.22 ok 25
47     ok 26
48     ok 27
49 wakaba 1.1 ok 28
50     ok 29
51     ok 30
52     ok 31
53     ok 32
54     ok 33
55 wakaba 1.18 ok 34
56 wakaba 1.1 ok 35
57     ok 36
58     ok 37
59 wakaba 1.8 ok 38
60 wakaba 1.28 ok 39
61     ok 40
62 wakaba 1.43 ok 41
63     ok 42
64 wakaba 1.286 ok 43
65 wakaba 1.11 # t/tokenizer/test2.test
66 wakaba 1.286 not ok 44
67     # Test 44 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #44)
68 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (DOCTYPE without name: qq'<!DOCTYPE>')
69 wakaba 1.20 # Line 6 is changed:
70 wakaba 1.8 # - " qq'',\n"
71 wakaba 1.20 # + " undef,\n"
72     ok 45
73     ok 46
74     ok 47
75     ok 48
76     ok 49
77     ok 50
78     ok 51
79 wakaba 1.97 ok 52
80     ok 53
81     ok 54
82     ok 55
83 wakaba 1.9 ok 56
84     ok 57
85 wakaba 1.1 ok 58
86     ok 59
87     ok 60
88 wakaba 1.19 ok 61
89 wakaba 1.1 ok 62
90     ok 63
91 wakaba 1.130 ok 64
92 wakaba 1.1 ok 65
93 wakaba 1.240 ok 66
94     ok 67
95     ok 68
96 wakaba 1.1 ok 69
97     ok 70
98 wakaba 1.34 ok 71
99     ok 72
100 wakaba 1.1 ok 73
101     ok 74
102 wakaba 1.21 ok 75
103     ok 76
104 wakaba 1.1 ok 77
105 wakaba 1.141 ok 78
106 wakaba 1.1 ok 79
107 wakaba 1.305 ok 80
108 wakaba 1.34 ok 81
109 wakaba 1.286 # t/tokenizer/test3.test
110 wakaba 1.15 ok 82
111 wakaba 1.1 ok 83
112     ok 84
113 wakaba 1.25 ok 85
114     ok 86
115 wakaba 1.34 ok 87
116 wakaba 1.1 ok 88
117     ok 89
118     ok 90
119     ok 91
120 wakaba 1.296 not ok 92
121     # Test 92 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #92)
122     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (<!----.: qq'<!----.')
123     # Line 3 is missing:
124     # - " qq'ParseError',\n"
125 wakaba 1.1 ok 93
126     ok 94
127 wakaba 1.8 ok 95
128     ok 96
129     ok 97
130     ok 98
131     ok 99
132     ok 100
133 wakaba 1.96 ok 101
134     ok 102
135     ok 103
136     ok 104
137 wakaba 1.141 ok 105
138 wakaba 1.286 ok 106
139     ok 107
140     ok 108
141     not ok 109
142     # Test 109 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #109)
143 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype >: qq'<!doctype >')
144 wakaba 1.43 # Line 5 is changed:
145     # - " qq'',\n"
146     # + " undef,\n"
147 wakaba 1.286 not ok 110
148     # Test 110 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #110)
149 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype : qq'<!doctype ')
150 wakaba 1.43 # Line 5 is changed:
151     # - " qq'',\n"
152     # + " undef,\n"
153 wakaba 1.8 ok 111
154     ok 112
155     ok 113
156 wakaba 1.10 ok 114
157 wakaba 1.287 ok 115
158 wakaba 1.10 ok 116
159     ok 117
160     ok 118
161 wakaba 1.287 ok 119
162 wakaba 1.10 ok 120
163     ok 121
164 wakaba 1.39 ok 122
165 wakaba 1.18 ok 123
166 wakaba 1.287 ok 124
167 wakaba 1.18 ok 125
168     ok 126
169 wakaba 1.20 ok 127
170 wakaba 1.240 ok 128
171 wakaba 1.20 ok 129
172 wakaba 1.287 ok 130
173 wakaba 1.240 ok 131
174 wakaba 1.20 ok 132
175     ok 133
176     ok 134
177 wakaba 1.287 ok 135
178 wakaba 1.20 ok 136
179 wakaba 1.303 not ok 137
180     # Test 137 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #137)
181     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'EndTag',\n qq'z'\n ]\n ];\n" (</z: qq'</z')
182     # Line 2 is changed:
183     # - " qq'ParseError',\n"
184     # + " qq'ParseError'\n"
185     # Lines 3-3 are missing:
186     # - " [\n"
187     # - " qq'EndTag',\n"
188     # - " qq'z'\n"
189     # - " ]\n"
190 wakaba 1.21 ok 138
191 wakaba 1.306 not ok 139
192     # Test 139 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #139)
193     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z : qq'<z ')
194     # Line 2 is changed:
195     # - " qq'ParseError',\n"
196     # + " qq'ParseError'\n"
197     # Lines 3-3 are missing:
198     # - " [\n"
199     # - " qq'StartTag',\n"
200     # - " qq'z',\n"
201     # - " {}\n"
202     # - " ]\n"
203 wakaba 1.20 ok 140
204 wakaba 1.306 not ok 141
205     # Test 141 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #141)
206     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z/ : qq'<z/ ')
207     # Line 3 is changed:
208     # - " qq'ParseError',\n"
209     # + " qq'ParseError'\n"
210     # Lines 4-4 are missing:
211     # - " [\n"
212     # - " qq'StartTag',\n"
213     # - " qq'z',\n"
214     # - " {}\n"
215     # - " ]\n"
216 wakaba 1.317 not ok 142
217     # Test 142 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #142)
218     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z//: qq'<z//')
219     # Line 3 is changed:
220     # - " qq'ParseError',\n"
221     # + " qq'ParseError'\n"
222     # Lines 4-4 are missing:
223     # - " [\n"
224     # - " qq'StartTag',\n"
225     # - " qq'z',\n"
226     # - " {}\n"
227     # - " ]\n"
228 wakaba 1.303 not ok 143
229     # Test 143 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #143)
230     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (<z: qq'<z')
231     # Line 2 is changed:
232     # - " qq'ParseError',\n"
233     # + " qq'ParseError'\n"
234     # Lines 3-3 are missing:
235     # - " [\n"
236     # - " qq'StartTag',\n"
237     # - " qq'z',\n"
238     # - " {}\n"
239     # - " ]\n"
240     not ok 144
241     # Test 144 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #144)
242     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'EndTag',\n qq'z'\n ]\n ];\n" (</z: qq'</z')
243     # Line 2 is changed:
244     # - " qq'ParseError',\n"
245     # + " qq'ParseError'\n"
246     # Lines 3-3 are missing:
247     # - " [\n"
248     # - " qq'EndTag',\n"
249     # - " qq'z'\n"
250     # - " ]\n"
251     not ok 145
252     # Test 145 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #145)
253     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z0',\n {}\n ]\n ];\n" (<z0: qq'<z0')
254     # Line 2 is changed:
255     # - " qq'ParseError',\n"
256     # + " qq'ParseError'\n"
257     # Lines 3-3 are missing:
258     # - " [\n"
259     # - " qq'StartTag',\n"
260     # - " qq'z0',\n"
261     # - " {}\n"
262     # - " ]\n"
263 wakaba 1.286 not ok 146
264     # Test 146 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #146)
265 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0=>: qq'<z/0=>')
266     # Got 1 extra line at line 3:
267     # + " qq'ParseError',\n"
268 wakaba 1.309 not ok 147
269     # Test 147 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #147)
270     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0= : qq'<z/0= ')
271     # Line 3 is changed:
272     # - " qq'ParseError',\n"
273     # + " qq'ParseError'\n"
274     # Lines 4-4 are missing:
275     # - " [\n"
276     # - " qq'StartTag',\n"
277     # - " qq'z',\n"
278     # - " {\n"
279     # - " 0 => qq''\n"
280     # - " }\n"
281     # - " ]\n"
282 wakaba 1.239 ok 148
283 wakaba 1.306 not ok 149
284     # Test 149 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #149)
285     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'?'\n }\n ]\n ];\n" (<z/0=? : qq'<z/0=? ')
286     # Line 3 is changed:
287     # - " qq'ParseError',\n"
288     # + " qq'ParseError'\n"
289     # Lines 4-4 are missing:
290     # - " [\n"
291     # - " qq'StartTag',\n"
292     # - " qq'z',\n"
293     # - " {\n"
294     # - " 0 => qq'?'\n"
295     # - " }\n"
296     # - " ]\n"
297 wakaba 1.314 not ok 150
298     # Test 150 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #150)
299     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'??'\n }\n ]\n ];\n" (<z/0=??: qq'<z/0=??')
300     # Line 3 is changed:
301     # - " qq'ParseError',\n"
302     # + " qq'ParseError'\n"
303     # Lines 4-4 are missing:
304     # - " [\n"
305     # - " qq'StartTag',\n"
306     # - " qq'z',\n"
307     # - " {\n"
308     # - " 0 => qq'??'\n"
309     # - " }\n"
310     # - " ]\n"
311 wakaba 1.316 not ok 151
312     # Test 151 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #151)
313     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0='': qq'<z/0=\x{27}\x{27}')
314     # Line 3 is changed:
315     # - " qq'ParseError',\n"
316     # + " qq'ParseError'\n"
317     # Lines 4-4 are missing:
318     # - " [\n"
319     # - " qq'StartTag',\n"
320     # - " qq'z',\n"
321     # - " {\n"
322     # - " 0 => qq''\n"
323     # - " }\n"
324     # - " ]\n"
325 wakaba 1.312 not ok 152
326     # Test 152 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #152)
327     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'&'\n }\n ]\n ];\n" (<z/0='&: qq'<z/0=\x{27}&')
328     # Line 3 is changed:
329     # - " qq'ParseError',\n"
330     # + " qq'ParseError'\n"
331     # Lines 4-4 are missing:
332     # - " [\n"
333     # - " qq'StartTag',\n"
334     # - " qq'z',\n"
335     # - " {\n"
336     # - " 0 => qq'&'\n"
337     # - " }\n"
338     # - " ]\n"
339     not ok 153
340     # Test 153 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #153)
341     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'%'\n }\n ]\n ];\n" (<z/0='%: qq'<z/0=\x{27}%')
342     # Line 3 is changed:
343     # - " qq'ParseError',\n"
344     # + " qq'ParseError'\n"
345     # Lines 4-4 are missing:
346     # - " [\n"
347     # - " qq'StartTag',\n"
348     # - " qq'z',\n"
349     # - " {\n"
350     # - " 0 => qq'%'\n"
351     # - " }\n"
352     # - " ]\n"
353 wakaba 1.22 ok 154
354 wakaba 1.316 not ok 155
355     # Test 155 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #155)
356     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0="": qq'<z/0=""')
357     # Line 3 is changed:
358     # - " qq'ParseError',\n"
359     # + " qq'ParseError'\n"
360     # Lines 4-4 are missing:
361     # - " [\n"
362     # - " qq'StartTag',\n"
363     # - " qq'z',\n"
364     # - " {\n"
365     # - " 0 => qq''\n"
366     # - " }\n"
367     # - " ]\n"
368 wakaba 1.22 ok 156
369 wakaba 1.314 not ok 157
370     # Test 157 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #157)
371     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'&'\n }\n ]\n ];\n" (<z/0=&: qq'<z/0=&')
372     # Line 3 is changed:
373     # - " qq'ParseError',\n"
374     # + " qq'ParseError'\n"
375     # Lines 4-4 are missing:
376     # - " [\n"
377     # - " qq'StartTag',\n"
378     # - " qq'z',\n"
379     # - " {\n"
380     # - " 0 => qq'&'\n"
381     # - " }\n"
382     # - " ]\n"
383 wakaba 1.28 ok 158
384 wakaba 1.309 not ok 159
385     # Test 159 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #159)
386     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 =: qq'<z/0 =')
387     # Line 3 is changed:
388     # - " qq'ParseError',\n"
389     # + " qq'ParseError'\n"
390     # Lines 4-4 are missing:
391     # - " [\n"
392     # - " qq'StartTag',\n"
393     # - " qq'z',\n"
394     # - " {\n"
395     # - " 0 => qq''\n"
396     # - " }\n"
397     # - " ]\n"
398 wakaba 1.239 ok 160
399 wakaba 1.308 not ok 161
400     # Test 161 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #161)
401     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 : qq'<z/0 ')
402     # Line 3 is changed:
403     # - " qq'ParseError',\n"
404     # + " qq'ParseError'\n"
405     # Lines 4-4 are missing:
406     # - " [\n"
407     # - " qq'StartTag',\n"
408     # - " qq'z',\n"
409     # - " {\n"
410     # - " 0 => qq''\n"
411     # - " }\n"
412     # - " ]\n"
413 wakaba 1.317 not ok 162
414     # Test 162 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #162)
415     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 /: qq'<z/0 /')
416     # Line 3 is changed:
417     # - " qq'ParseError',\n"
418     # + " qq'ParseError'\n"
419     # Lines 4-4 are missing:
420     # - " [\n"
421     # - " qq'StartTag',\n"
422     # - " qq'z',\n"
423     # - " {\n"
424     # - " 0 => qq''\n"
425     # - " }\n"
426     # - " ]\n"
427     not ok 163
428     # Test 163 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #163)
429     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0/: qq'<z/0/')
430     # Line 3 is changed:
431     # - " qq'ParseError',\n"
432     # + " qq'ParseError'\n"
433     # Lines 4-4 are missing:
434     # - " [\n"
435     # - " qq'StartTag',\n"
436     # - " qq'z',\n"
437     # - " {\n"
438     # - " 0 => qq''\n"
439     # - " }\n"
440     # - " ]\n"
441 wakaba 1.307 not ok 164
442     # Test 164 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #164)
443     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'00' => qq''\n }\n ]\n ];\n" (<z/00: qq'<z/00')
444     # Line 3 is changed:
445     # - " qq'ParseError',\n"
446     # + " qq'ParseError'\n"
447     # Lines 4-4 are missing:
448     # - " [\n"
449     # - " qq'StartTag',\n"
450     # - " qq'z',\n"
451     # - " {\n"
452     # - " qq'00' => qq''\n"
453     # - " }\n"
454     # - " ]\n"
455     not ok 165
456     # Test 165 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #165)
457     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0 0: qq'<z/0 0')
458     # Line 4 is changed:
459     # - " qq'ParseError',\n"
460     # + " qq'ParseError'\n"
461     # Lines 5-5 are missing:
462     # - " [\n"
463     # - " qq'StartTag',\n"
464     # - " qq'z',\n"
465     # - " {\n"
466     # - " 0 => qq''\n"
467     # - " }\n"
468     # - " ]\n"
469 wakaba 1.312 not ok 166
470     # Test 166 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #166)
471     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'\\x{09}'\n }\n ]\n ];\n" (<z/0='&#9: qq'<z/0=\x{27}&#9')
472     # Line 4 is changed:
473     # - " qq'ParseError',\n"
474     # + " qq'ParseError'\n"
475     # Lines 5-5 are missing:
476     # - " [\n"
477     # - " qq'StartTag',\n"
478     # - " qq'z',\n"
479     # - " {\n"
480     # - " 0 => qq'\\x{09}'\n"
481     # - " }\n"
482     # - " ]\n"
483 wakaba 1.28 ok 167
484 wakaba 1.314 not ok 168
485     # Test 168 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #168)
486     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'\\x{09}'\n }\n ]\n ];\n" (<z/0=&#9: qq'<z/0=&#9')
487     # Line 4 is changed:
488     # - " qq'ParseError',\n"
489     # + " qq'ParseError'\n"
490     # Lines 5-5 are missing:
491     # - " [\n"
492     # - " qq'StartTag',\n"
493     # - " qq'z',\n"
494     # - " {\n"
495     # - " 0 => qq'\\x{09}'\n"
496     # - " }\n"
497     # - " ]\n"
498 wakaba 1.307 not ok 169
499     # Test 169 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #169)
500     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'0z' => qq''\n }\n ]\n ];\n" (<z/0z: qq'<z/0z')
501     # Line 3 is changed:
502     # - " qq'ParseError',\n"
503     # + " qq'ParseError'\n"
504     # Lines 4-4 are missing:
505     # - " [\n"
506     # - " qq'StartTag',\n"
507     # - " qq'z',\n"
508     # - " {\n"
509     # - " qq'0z' => qq''\n"
510     # - " }\n"
511     # - " ]\n"
512     not ok 170
513     # Test 170 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #170)
514     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'',\n qq'z' => qq''\n }\n ]\n ];\n" (<z/0 z: qq'<z/0 z')
515     # Line 3 is changed:
516     # - " qq'ParseError',\n"
517     # + " qq'ParseError'\n"
518     # Lines 4-4 are missing:
519     # - " [\n"
520     # - " qq'StartTag',\n"
521     # - " qq'z',\n"
522     # - " {\n"
523     # - " 0 => qq'',\n"
524     # - " qq'z' => qq''\n"
525     # - " }\n"
526     # - " ]\n"
527 wakaba 1.303 not ok 171
528     # Test 171 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #171)
529     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'zz',\n {}\n ]\n ];\n" (<zz: qq'<zz')
530     # Line 2 is changed:
531     # - " qq'ParseError',\n"
532     # + " qq'ParseError'\n"
533     # Lines 3-3 are missing:
534     # - " [\n"
535     # - " qq'StartTag',\n"
536     # - " qq'zz',\n"
537     # - " {}\n"
538     # - " ]\n"
539 wakaba 1.307 not ok 172
540     # Test 172 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #172)
541     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'z' => qq''\n }\n ]\n ];\n" (<z/z: qq'<z/z')
542     # Line 3 is changed:
543     # - " qq'ParseError',\n"
544     # + " qq'ParseError'\n"
545     # Lines 4-4 are missing:
546     # - " [\n"
547     # - " qq'StartTag',\n"
548     # - " qq'z',\n"
549     # - " {\n"
550     # - " qq'z' => qq''\n"
551     # - " }\n"
552     # - " ]\n"
553 wakaba 1.286 # t/tokenizer/test4.test
554 wakaba 1.299 not ok 173
555 wakaba 1.307 # Test 173 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #173)
556 wakaba 1.299 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq'',\n qq'<' => qq''\n }\n ]\n ];\n" (< in attribute name: qq'<z/0 <')
557 wakaba 1.307 # Line 4 is changed:
558     # - " [\n"
559     # + " qq'ParseError'\n"
560     # Lines 5-5 are missing:
561     # - " qq'StartTag',\n"
562     # - " qq'z',\n"
563     # - " {\n"
564     # - " 0 => qq'',\n"
565     # - " qq'<' => qq''\n"
566     # - " }\n"
567     # - " ]\n"
568 wakaba 1.293 not ok 174
569 wakaba 1.314 # Test 174 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #174)
570 wakaba 1.293 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'x' => qq'<'\n }\n ]\n ];\n" (< in attribute value: qq'<z x=<')
571 wakaba 1.314 # Line 3 is changed:
572     # - " [\n"
573     # + " qq'ParseError'\n"
574     # Lines 4-4 are missing:
575     # - " qq'StartTag',\n"
576     # - " qq'z',\n"
577     # - " {\n"
578     # - " qq'x' => qq'<'\n"
579     # - " }\n"
580     # - " ]\n"
581 wakaba 1.286 ok 175
582     ok 176
583     not ok 177
584     # Test 177 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #177)
585 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (== attribute: qq'<z ==>')
586     # Got 1 extra line at line 3:
587     # + " qq'ParseError',\n"
588 wakaba 1.28 ok 178
589 wakaba 1.33 ok 179
590 wakaba 1.34 ok 180
591 wakaba 1.38 ok 181
592     ok 182
593 wakaba 1.43 ok 183
594     ok 184
595     ok 185
596     ok 186
597     ok 187
598     ok 188
599 wakaba 1.240 ok 189
600     ok 190
601 wakaba 1.43 ok 191
602     ok 192
603     ok 193
604     ok 194
605     ok 195
606     ok 196
607 wakaba 1.306 not ok 197
608     # Test 197 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #197)
609     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {}\n ]\n ];\n" (CR EOF in tag name: qq'<z\x{0D}')
610     # Line 2 is changed:
611     # - " qq'ParseError',\n"
612     # + " qq'ParseError'\n"
613     # Lines 3-3 are missing:
614     # - " [\n"
615     # - " qq'StartTag',\n"
616     # - " qq'z',\n"
617     # - " {}\n"
618     # - " ]\n"
619 wakaba 1.96 ok 198
620     ok 199
621 wakaba 1.286 ok 200
622 wakaba 1.96 ok 201
623 wakaba 1.130 ok 202
624 wakaba 1.43 ok 203
625     ok 204
626     ok 205
627     ok 206
628     ok 207
629     ok 208
630     ok 209
631     ok 210
632     ok 211
633     ok 212
634     ok 213
635     ok 214
636 wakaba 1.240 ok 215
637     ok 216
638 wakaba 1.43 ok 217
639     ok 218
640     ok 219
641     ok 220
642 wakaba 1.141 ok 221
643 wakaba 1.286 ok 222
644 wakaba 1.298 not ok 223
645     # Test 223 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #223)
646     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (1): qq'<!DoCtYpE HtMl PuBlIc "AbC" "XyZ">')
647     # Line 4 is changed:
648     # - " qq'HtMl',\n"
649     # + " qq'html',\n"
650     not ok 224
651     # Test 224 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #224)
652     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (2): qq'<!dOcTyPe hTmL pUbLiC "aBc" "xYz">')
653     # Line 4 is changed:
654     # - " qq'hTmL',\n"
655     # + " qq'html',\n"
656     not ok 225
657     # Test 225 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #225)
658     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (1): qq'<!DoCtYpE HtMl SyStEm "XyZ">')
659     # Line 4 is changed:
660     # - " qq'HtMl',\n"
661     # + " qq'html',\n"
662     not ok 226
663     # Test 226 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #226)
664     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (2): qq'<!dOcTyPe hTmL sYsTeM "xYz">')
665     # Line 4 is changed:
666     # - " qq'hTmL',\n"
667     # + " qq'html',\n"
668 wakaba 1.286 not ok 227
669     # Test 227 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #227)
670 wakaba 1.130 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (U+0000 in lookahead region after non-matching character: qq'<!doc>\x{00}')
671     # Got 1 extra line at line 3:
672     # + " qq'ParseError',\n"
673     # Line 8 is missing:
674     # - " qq'ParseError',\n"
675 wakaba 1.43 ok 228
676     ok 229
677     ok 230
678     ok 231
679     ok 232
680     ok 233
681     ok 234
682     ok 235
683 wakaba 1.141 ok 236
684 wakaba 1.43 ok 237
685     ok 238
686     ok 239
687     ok 240
688     ok 241
689 wakaba 1.287 ok 242
690 wakaba 1.43 ok 243
691 wakaba 1.287 ok 244
692 wakaba 1.286 # t/tokenizer/contentModelFlags.test
693 wakaba 1.43 ok 245
694     ok 246
695     ok 247
696     ok 248
697 wakaba 1.141 ok 249
698 wakaba 1.43 ok 250
699     ok 251
700     ok 252
701     ok 253
702     ok 254
703     ok 255
704 wakaba 1.141 ok 256
705 wakaba 1.43 ok 257
706 wakaba 1.286 # t/tokenizer/escapeFlag.test
707 wakaba 1.43 ok 258
708     ok 259
709     ok 260
710     ok 261
711     ok 262
712 wakaba 1.206 ok 263
713 wakaba 1.43 ok 264
714     ok 265
715     ok 266
716 wakaba 1.286 # t/tokenizer/entities.test
717 wakaba 1.43 ok 267
718     ok 268
719     ok 269
720     ok 270
721     ok 271
722     ok 272
723     ok 273
724     ok 274
725     ok 275
726     ok 276
727     ok 277
728     ok 278
729     ok 279
730     ok 280
731     ok 281
732     ok 282
733     ok 283
734     ok 284
735     ok 285
736     ok 286
737     ok 287
738     ok 288
739     ok 289
740     ok 290
741     ok 291
742     ok 292
743     ok 293
744     ok 294
745     ok 295
746     ok 296
747     ok 297
748     ok 298
749     ok 299
750     ok 300
751     ok 301
752     ok 302
753     ok 303
754     ok 304
755     ok 305
756     ok 306
757     ok 307
758     ok 308
759     ok 309
760     ok 310
761     ok 311
762     ok 312
763     ok 313
764     ok 314
765     ok 315
766     ok 316
767     ok 317
768     ok 318
769     ok 319
770     ok 320
771     ok 321
772     ok 322
773     ok 323
774     ok 324
775     ok 325
776     ok 326
777     ok 327
778     ok 328
779     ok 329
780     ok 330
781     ok 331
782     ok 332
783     ok 333
784     ok 334
785     ok 335
786     ok 336
787     ok 337
788 wakaba 1.59 ok 338
789     ok 339
790     ok 340
791     ok 341
792     ok 342
793     ok 343
794     ok 344
795     ok 345
796     ok 346
797     ok 347
798 wakaba 1.62 ok 348
799     ok 349
800     ok 350
801     ok 351
802     ok 352
803     ok 353
804     ok 354
805     ok 355
806     ok 356
807     ok 357
808     ok 358
809     ok 359
810 wakaba 1.96 ok 360
811     ok 361
812     ok 362
813     ok 363
814 wakaba 1.129 ok 364
815     ok 365
816     ok 366
817     ok 367
818     ok 368
819     ok 369
820     ok 370
821     ok 371
822     ok 372
823     ok 373
824     ok 374
825     ok 375
826     ok 376
827     ok 377
828     ok 378
829     ok 379
830     ok 380
831     ok 381
832     ok 382
833     ok 383
834     ok 384
835     ok 385
836     ok 386
837     ok 387
838     ok 388
839     ok 389
840     ok 390
841     ok 391
842     ok 392
843     ok 393
844     ok 394
845     ok 395
846     ok 396
847 wakaba 1.130 ok 397
848     ok 398
849     ok 399
850     ok 400
851     ok 401
852     ok 402
853     ok 403
854     ok 404
855     ok 405
856     ok 406
857     ok 407
858     ok 408
859     ok 409
860     ok 410
861     ok 411
862     ok 412
863     ok 413
864     ok 414
865     ok 415
866     ok 416
867 wakaba 1.132 ok 417
868     ok 418
869     ok 419
870     ok 420
871 wakaba 1.136 ok 421
872     ok 422
873     ok 423
874     ok 424
875     ok 425
876     ok 426
877     ok 427
878     ok 428
879     ok 429
880     ok 430
881     ok 431
882     ok 432
883     ok 433
884     ok 434
885 wakaba 1.205 ok 435
886 wakaba 1.136 ok 436
887     ok 437
888     ok 438
889 wakaba 1.205 ok 439
890 wakaba 1.136 ok 440
891     ok 441
892     ok 442
893 wakaba 1.205 ok 443
894 wakaba 1.136 ok 444
895     ok 445
896 wakaba 1.205 ok 446
897 wakaba 1.136 ok 447
898     ok 448
899     ok 449
900     ok 450
901     ok 451
902     ok 452
903     ok 453
904     ok 454
905     ok 455
906     ok 456
907     ok 457
908     ok 458
909     ok 459
910     ok 460
911     ok 461
912     ok 462
913     ok 463
914     ok 464
915     ok 465
916     ok 466
917     ok 467
918     ok 468
919     ok 469
920     ok 470
921     ok 471
922 wakaba 1.141 ok 472
923 wakaba 1.195 ok 473
924     ok 474
925     ok 475
926     ok 476
927     ok 477
928 wakaba 1.205 ok 478
929     ok 479
930     ok 480
931     ok 481
932     ok 482
933     ok 483
934     ok 484
935     ok 485
936     ok 486
937     ok 487
938     ok 488
939     ok 489
940     ok 490
941     ok 491
942     ok 492
943     ok 493
944     ok 494
945     ok 495
946     ok 496
947     ok 497
948     ok 498
949     ok 499
950     ok 500
951     ok 501
952     ok 502
953     ok 503
954     ok 504
955     ok 505
956     ok 506
957     ok 507
958     ok 508
959     ok 509
960     ok 510
961     ok 511
962     ok 512
963     ok 513
964     ok 514
965     ok 515
966     ok 516
967     ok 517
968     ok 518
969     ok 519
970     ok 520
971     ok 521
972     ok 522
973     ok 523
974     ok 524
975     ok 525
976     ok 526
977     ok 527
978     ok 528
979     ok 529
980     ok 530
981     ok 531
982     ok 532
983     ok 533
984     ok 534
985     ok 535
986     ok 536
987     ok 537
988     ok 538
989     ok 539
990 wakaba 1.210 ok 540
991 wakaba 1.205 ok 541
992     ok 542
993     ok 543
994     ok 544
995     ok 545
996     ok 546
997     ok 547
998     ok 548
999     ok 549
1000     ok 550
1001     ok 551
1002     ok 552
1003     ok 553
1004     ok 554
1005     ok 555
1006     ok 556
1007     ok 557
1008     ok 558
1009     ok 559
1010     ok 560
1011     ok 561
1012     ok 562
1013     ok 563
1014     ok 564
1015     ok 565
1016     ok 566
1017     ok 567
1018     ok 568
1019     ok 569
1020     ok 570
1021     ok 571
1022     ok 572
1023     ok 573
1024     ok 574
1025     ok 575
1026     ok 576
1027     ok 577
1028     ok 578
1029     ok 579
1030     ok 580
1031     ok 581
1032     ok 582
1033     ok 583
1034     ok 584
1035     ok 585
1036     ok 586
1037     ok 587
1038     ok 588
1039     ok 589
1040     ok 590
1041     ok 591
1042     ok 592
1043     ok 593
1044     ok 594
1045     ok 595
1046     ok 596
1047     ok 597
1048     ok 598
1049     ok 599
1050     ok 600
1051     ok 601
1052     ok 602
1053     ok 603
1054     ok 604
1055     ok 605
1056     ok 606
1057     ok 607
1058     ok 608
1059     ok 609
1060     ok 610
1061     ok 611
1062     ok 612
1063     ok 613
1064     ok 614
1065     ok 615
1066     ok 616
1067     ok 617
1068     ok 618
1069     ok 619
1070     ok 620
1071     ok 621
1072     ok 622
1073     ok 623
1074     ok 624
1075     ok 625
1076     ok 626
1077     ok 627
1078     ok 628
1079     ok 629
1080     ok 630
1081     ok 631
1082     ok 632
1083     ok 633
1084     ok 634
1085     ok 635
1086     ok 636
1087     ok 637
1088     ok 638
1089     ok 639
1090     ok 640
1091     ok 641
1092     ok 642
1093     ok 643
1094     ok 644
1095     ok 645
1096     ok 646
1097     ok 647
1098     ok 648
1099     ok 649
1100     ok 650
1101     ok 651
1102     ok 652
1103     ok 653
1104     ok 654
1105     ok 655
1106     ok 656
1107     ok 657
1108     ok 658
1109     ok 659
1110     ok 660
1111     ok 661
1112     ok 662
1113     ok 663
1114     ok 664
1115     ok 665
1116     ok 666
1117     ok 667
1118     ok 668
1119     ok 669
1120     ok 670
1121     ok 671
1122     ok 672
1123     ok 673
1124     ok 674
1125     ok 675
1126     ok 676
1127     ok 677
1128     ok 678
1129     ok 679
1130     ok 680
1131     ok 681
1132     ok 682
1133     ok 683
1134     ok 684
1135     ok 685
1136     ok 686
1137     ok 687
1138     ok 688
1139     ok 689
1140     ok 690
1141     ok 691
1142     ok 692
1143     ok 693
1144     ok 694
1145     ok 695
1146     ok 696
1147     ok 697
1148     ok 698
1149     ok 699
1150     ok 700
1151     ok 701
1152     ok 702
1153     ok 703
1154     ok 704
1155     ok 705
1156     ok 706
1157     ok 707
1158     ok 708
1159     ok 709
1160     ok 710
1161     ok 711
1162     ok 712
1163     ok 713
1164     ok 714
1165     ok 715
1166     ok 716
1167     ok 717
1168     ok 718
1169     ok 719
1170     ok 720
1171     ok 721
1172     ok 722
1173     ok 723
1174     ok 724
1175     ok 725
1176     ok 726
1177     ok 727
1178     ok 728
1179     ok 729
1180     ok 730
1181     ok 731
1182     ok 732
1183     ok 733
1184     ok 734
1185     ok 735
1186     ok 736
1187     ok 737
1188     ok 738
1189     ok 739
1190     ok 740
1191     ok 741
1192     ok 742
1193     ok 743
1194     ok 744
1195     ok 745
1196     ok 746
1197     ok 747
1198     ok 748
1199     ok 749
1200     ok 750
1201     ok 751
1202     ok 752
1203     ok 753
1204     ok 754
1205     ok 755
1206     ok 756
1207     ok 757
1208     ok 758
1209     ok 759
1210     ok 760
1211     ok 761
1212     ok 762
1213     ok 763
1214     ok 764
1215     ok 765
1216     ok 766
1217     ok 767
1218     ok 768
1219     ok 769
1220     ok 770
1221     ok 771
1222     ok 772
1223     ok 773
1224     ok 774
1225     ok 775
1226     ok 776
1227     ok 777
1228     ok 778
1229     ok 779
1230     ok 780
1231     ok 781
1232     ok 782
1233     ok 783
1234     ok 784
1235     ok 785
1236     ok 786
1237     ok 787
1238     ok 788
1239     ok 789
1240     ok 790
1241     ok 791
1242     ok 792
1243     ok 793
1244     ok 794
1245     ok 795
1246     ok 796
1247     ok 797
1248     ok 798
1249     ok 799
1250     ok 800
1251     ok 801
1252     ok 802
1253     ok 803
1254     ok 804
1255     ok 805
1256     ok 806
1257     ok 807
1258     ok 808
1259     ok 809
1260     ok 810
1261     ok 811
1262     ok 812
1263     ok 813
1264     ok 814
1265     ok 815
1266     ok 816
1267     ok 817
1268     ok 818
1269     ok 819
1270     ok 820
1271     ok 821
1272     ok 822
1273     ok 823
1274     ok 824
1275     ok 825
1276     ok 826
1277     ok 827
1278     ok 828
1279     ok 829
1280     ok 830
1281     ok 831
1282     ok 832
1283     ok 833
1284     ok 834
1285     ok 835
1286     ok 836
1287     ok 837
1288     ok 838
1289     ok 839
1290     ok 840
1291     ok 841
1292     ok 842
1293     ok 843
1294     ok 844
1295     ok 845
1296 wakaba 1.286 ok 846
1297     ok 847
1298     ok 848
1299     ok 849
1300     ok 850
1301 wakaba 1.205 # t/tokenizer/xmlViolation.test
1302 wakaba 1.286 not ok 851
1303     # Test 851 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFF}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #851)
1304 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFD}b'\n ]\n ];\n" (Non-XML character: qq'a\x{FFFF}b')
1305     # Line 5 is changed:
1306     # - " qq'a\\x{FFFD}b'\n"
1307     # + " qq'a\\x{FFFF}b'\n"
1308 wakaba 1.286 not ok 852
1309     # Test 852 got: "$VAR1 = [\n [\n qq'Character',\n qq'a\\x{0C}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #852)
1310 wakaba 1.206 # Expected: "$VAR1 = [\n [\n qq'Character',\n qq'a b'\n ]\n ];\n" (Non-XML space: qq'a\x{0C}b')
1311     # Line 4 is changed:
1312     # - " qq'a b'\n"
1313     # + " qq'a\\x{0C}b'\n"
1314 wakaba 1.286 not ok 853
1315 wakaba 1.302 # Test 853 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' foo -- bar '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #853)
1316 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' foo - - bar '\n ]\n ];\n" (Double hyphen in comment: qq'<!-- foo -- bar -->')
1317 wakaba 1.302 # Line 5 is changed:
1318 wakaba 1.206 # - " qq' foo - - bar '\n"
1319     # + " qq' foo -- bar '\n"
1320 wakaba 1.286 ok 854
1321 wakaba 1.205 # t/tokenizer-test-1.test
1322     ok 855
1323     ok 856
1324     ok 857
1325     ok 858
1326     ok 859
1327     ok 860
1328     ok 861
1329     ok 862
1330     ok 863
1331     ok 864
1332     ok 865
1333     ok 866
1334     ok 867
1335     ok 868
1336     ok 869
1337     ok 870
1338     ok 871
1339     ok 872
1340     ok 873
1341     ok 874
1342     ok 875
1343     ok 876
1344     ok 877
1345     ok 878
1346     ok 879
1347     ok 880
1348     ok 881
1349     ok 882
1350     ok 883
1351     ok 884
1352     ok 885
1353     ok 886
1354     ok 887
1355     ok 888
1356     ok 889
1357     ok 890
1358     ok 891
1359     ok 892
1360     ok 893
1361     ok 894
1362     ok 895
1363     ok 896
1364     ok 897
1365     ok 898
1366     ok 899
1367     ok 900
1368     ok 901
1369     ok 902
1370     ok 903
1371     ok 904
1372     ok 905
1373     ok 906
1374     ok 907
1375     ok 908
1376     ok 909
1377     ok 910
1378     ok 911
1379     ok 912
1380     ok 913
1381     ok 914
1382     ok 915
1383     ok 916
1384     ok 917
1385     ok 918
1386     ok 919
1387     ok 920
1388     ok 921
1389     ok 922
1390     ok 923
1391     ok 924
1392     ok 925
1393 wakaba 1.298 ok 926
1394     ok 927
1395     not ok 928
1396     # Test 928 got: "$VAR1 = [\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #928)
1397 wakaba 1.296 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (<!----x-->: qq'<!----x-->')
1398     # Line 2 is missing:
1399     # - " qq'ParseError',\n"
1400 wakaba 1.205 ok 929
1401     ok 930
1402     ok 931
1403     ok 932
1404     ok 933
1405     ok 934
1406     ok 935
1407     ok 936
1408     ok 937
1409 wakaba 1.281 ok 938
1410     ok 939
1411     ok 940
1412     ok 941
1413     ok 942
1414     ok 943
1415     ok 944
1416     ok 945
1417 wakaba 1.285 ok 946
1418 wakaba 1.205 ok 947
1419     ok 948
1420     ok 949
1421     ok 950
1422     ok 951
1423     ok 952
1424     ok 953
1425     ok 954
1426     ok 955
1427     ok 956
1428     ok 957
1429     ok 958
1430     ok 959
1431     ok 960
1432     ok 961
1433     ok 962
1434 wakaba 1.286 ok 963
1435     ok 964
1436 wakaba 1.290 ok 965
1437     ok 966
1438     ok 967
1439 wakaba 1.298 ok 968
1440     ok 969
1441     not ok 970
1442     # Test 970 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}\\x{DFFF}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #970)
1443 wakaba 1.285 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DFFF}'\n ]\n ];\n" (surrogate character reference: qq'&#xD800;\x{DFFF}')
1444     # Lines 3-3 are missing:
1445     # - " [\n"
1446     # - " qq'Character',\n"
1447     # - " qq'\\x{FFFD}'\n"
1448     # - " ],\n"
1449     # Line 6 is changed:
1450     # - " qq'\\x{DFFF}'\n"
1451     # + " qq'\\x{FFFD}\\x{DFFF}'\n"
1452 wakaba 1.205 ok 971
1453     ok 972
1454     ok 973
1455     ok 974
1456     ok 975
1457     ok 976
1458     ok 977
1459     ok 978
1460     ok 979
1461     ok 980
1462     ok 981
1463     ok 982
1464     ok 983
1465     ok 984
1466     ok 985
1467     ok 986
1468     ok 987
1469     ok 988
1470     ok 989
1471     ok 990
1472     ok 991
1473     ok 992
1474     ok 993
1475     ok 994
1476     ok 995
1477     ok 996
1478     ok 997
1479     ok 998
1480     ok 999
1481     ok 1000
1482     ok 1001
1483     ok 1002
1484     ok 1003
1485     ok 1004
1486     ok 1005
1487     ok 1006
1488     ok 1007
1489     ok 1008
1490     ok 1009
1491     ok 1010
1492     ok 1011
1493     ok 1012
1494     ok 1013
1495     ok 1014
1496     ok 1015
1497     ok 1016
1498     ok 1017
1499     ok 1018
1500 wakaba 1.206 ok 1019
1501     ok 1020
1502 wakaba 1.312 not ok 1021
1503     # Test 1021 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1021)
1504     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'a',\n {\n qq'href' => qq'\\x{A9}'\n }\n ]\n ];\n" (entity w/o refc at the end of unterminated attribute value: qq'<a href=\x{27}&COPY')
1505     # Line 3 is changed:
1506     # - " qq'ParseError',\n"
1507     # + " qq'ParseError'\n"
1508     # Lines 4-4 are missing:
1509     # - " [\n"
1510     # - " qq'StartTag',\n"
1511     # - " qq'a',\n"
1512     # - " {\n"
1513     # - " qq'href' => qq'\\x{A9}'\n"
1514     # - " }\n"
1515     # - " ]\n"
1516 wakaba 1.206 ok 1022
1517     ok 1023
1518     ok 1024
1519     ok 1025
1520 wakaba 1.240 ok 1026
1521 wakaba 1.206 ok 1027
1522     ok 1028
1523     ok 1029
1524 wakaba 1.240 ok 1030
1525 wakaba 1.206 ok 1031
1526     ok 1032
1527     ok 1033
1528 wakaba 1.240 ok 1034
1529 wakaba 1.206 ok 1035
1530     ok 1036
1531 wakaba 1.240 ok 1037
1532 wakaba 1.205 ok 1038
1533     ok 1039
1534 wakaba 1.298 ok 1040
1535     ok 1041
1536 wakaba 1.299 ok 1042
1537 wakaba 1.298 ok 1043
1538 wakaba 1.299 ok 1044
1539 wakaba 1.205 ok 1045
1540 wakaba 1.312 not ok 1046
1541     # Test 1046 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1046)
1542     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'left<div>'\n }\n ]\n ];\n" (< in attribute value (single-unquoted) state: qq'<p align=\x{27}left<div>')
1543     # Line 2 is changed:
1544     # - " qq'ParseError',\n"
1545     # + " qq'ParseError'\n"
1546     # Lines 3-3 are missing:
1547     # - " [\n"
1548     # - " qq'StartTag',\n"
1549     # - " qq'p',\n"
1550     # - " {\n"
1551     # - " qq'align' => qq'left<div>'\n"
1552     # - " }\n"
1553     # - " ]\n"
1554 wakaba 1.205 ok 1047
1555     ok 1048
1556     ok 1049
1557     ok 1050
1558     ok 1051
1559     ok 1052
1560     ok 1053
1561     ok 1054
1562     ok 1055
1563     ok 1056
1564     ok 1057
1565     ok 1058
1566     ok 1059
1567     ok 1060
1568     ok 1061
1569 wakaba 1.206 ok 1062
1570     ok 1063
1571     ok 1064
1572     ok 1065
1573     ok 1066
1574     ok 1067
1575     ok 1068
1576 wakaba 1.227 ok 1069
1577     ok 1070
1578     ok 1071
1579     ok 1072
1580     ok 1073
1581 wakaba 1.247 ok 1074
1582     ok 1075
1583     ok 1076
1584     ok 1077
1585     ok 1078
1586     ok 1079
1587     ok 1080
1588 wakaba 1.281 ok 1081
1589     ok 1082
1590     ok 1083
1591     ok 1084
1592     ok 1085
1593     ok 1086
1594     ok 1087
1595     ok 1088
1596     ok 1089
1597     ok 1090
1598     ok 1091
1599     ok 1092
1600     ok 1093
1601     ok 1094
1602     ok 1095
1603     ok 1096
1604     ok 1097
1605 wakaba 1.285 ok 1098
1606     ok 1099
1607     ok 1100
1608     ok 1101
1609     ok 1102
1610     ok 1103
1611     ok 1104
1612     ok 1105
1613 wakaba 1.305 ok 1106
1614 wakaba 1.285 ok 1107
1615     ok 1108
1616     ok 1109
1617 wakaba 1.305 ok 1110
1618 wakaba 1.285 ok 1111
1619     ok 1112
1620     ok 1113
1621     ok 1114
1622     ok 1115
1623     ok 1116
1624     ok 1117
1625     ok 1118
1626     ok 1119
1627     ok 1120
1628     ok 1121
1629     ok 1122
1630     ok 1123
1631 wakaba 1.312 not ok 1124
1632     # Test 1124 got: "$VAR1 = [\n qq'ParseError'\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1124)
1633     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'a',\n {\n qq'a' => qq'>'\n }\n ]\n ];\n" (<a a='>: qq'<a a=\x{27}>')
1634     # Line 2 is changed:
1635     # - " qq'ParseError',\n"
1636     # + " qq'ParseError'\n"
1637     # Lines 3-3 are missing:
1638     # - " [\n"
1639     # - " qq'StartTag',\n"
1640     # - " qq'a',\n"
1641     # - " {\n"
1642     # - " qq'a' => qq'>'\n"
1643     # - " }\n"
1644     # - " ]\n"
1645 wakaba 1.285 ok 1125
1646     ok 1126
1647     ok 1127
1648 wakaba 1.305 ok 1128
1649 wakaba 1.286 ok 1129
1650 wakaba 1.305 ok 1130
1651 wakaba 1.290 ok 1131
1652     ok 1132
1653 wakaba 1.293 ok 1133
1654     ok 1134
1655 wakaba 1.308 ok 1135
1656 wakaba 1.298 ok 1136
1657 wakaba 1.309 ok 1137
1658 wakaba 1.310 ok 1138
1659 wakaba 1.313 ok 1139
1660 wakaba 1.301 ok 1140
1661 wakaba 1.315 ok 1141
1662 wakaba 1.305 ok 1142
1663     ok 1143
1664 wakaba 1.302 ok 1144
1665     ok 1145
1666 wakaba 1.316 ok 1146
1667     ok 1147
1668 wakaba 1.317 ok 1148
1669     ok 1149
1670     ok 1150
