
Contents of /markup/html/whatpm/t/tokenizer-result.txt



Revision 1.130
Sun Mar 2 14:41:28 2008 UTC by wakaba
Branch: MAIN
Changes since 1.129: +39 -35 lines
File MIME type: text/plain
Results of new test data

1 wakaba 1.130 1..416
2 wakaba 1.1 # Running under perl version 5.008007 for linux
3 wakaba 1.130 # Current time local: Sun Mar 2 23:40:02 2008
4     # Current time GMT: Sun Mar 2 14:40:02 2008
5 wakaba 1.1 # Using Test.pm version 1.25
6 wakaba 1.11 # t/tokenizer/test1.test
7 wakaba 1.20 ok 1
8     ok 2
9     ok 3
10 wakaba 1.1 ok 4
11 wakaba 1.20 ok 5
12 wakaba 1.1 ok 6
13     ok 7
14     ok 8
15     ok 9
16     ok 10
17     ok 11
18     ok 12
19     ok 13
20     ok 14
21 wakaba 1.130 ok 15
22 wakaba 1.1 ok 16
23     ok 17
24     ok 18
25     ok 19
26     ok 20
27     ok 21
28 wakaba 1.25 ok 22
29     ok 23
30 wakaba 1.1 ok 24
31 wakaba 1.22 ok 25
32     ok 26
33     ok 27
34 wakaba 1.1 ok 28
35     ok 29
36     ok 30
37     ok 31
38     ok 32
39     ok 33
40 wakaba 1.18 ok 34
41 wakaba 1.1 ok 35
42     ok 36
43     ok 37
44 wakaba 1.8 ok 38
45 wakaba 1.28 ok 39
46     ok 40
47 wakaba 1.43 ok 41
48     ok 42
49 wakaba 1.11 # t/tokenizer/test2.test
50 wakaba 1.43 not ok 43
51 wakaba 1.48 # Test 43 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 158 fail #43)
52 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (DOCTYPE without name: qq'<!DOCTYPE>')
53 wakaba 1.20 # Line 6 is changed:
54 wakaba 1.8 # - " qq'',\n"
55 wakaba 1.20 # + " undef,\n"
56 wakaba 1.130 # t/HTML-tokenizer.t line 158 is: ok $parser_dump, $expected_dump,
57 wakaba 1.20 ok 44
58     ok 45
59     ok 46
60     ok 47
61     ok 48
62     ok 49
63     ok 50
64     ok 51
65 wakaba 1.97 ok 52
66     ok 53
67     ok 54
68     ok 55
69 wakaba 1.9 ok 56
70     ok 57
71 wakaba 1.1 ok 58
72     ok 59
73     ok 60
74 wakaba 1.19 ok 61
75 wakaba 1.1 ok 62
76     ok 63
77 wakaba 1.130 ok 64
78 wakaba 1.1 ok 65
79     ok 66
80     ok 67
81     ok 68
82     ok 69
83     ok 70
84 wakaba 1.34 ok 71
85     ok 72
86 wakaba 1.1 ok 73
87     ok 74
88 wakaba 1.21 ok 75
89     ok 76
90 wakaba 1.1 ok 77
91 wakaba 1.96 # t/tokenizer/test3.test
92 wakaba 1.1 ok 78
93     ok 79
94     ok 80
95 wakaba 1.34 ok 81
96 wakaba 1.15 ok 82
97 wakaba 1.1 ok 83
98     ok 84
99 wakaba 1.25 ok 85
100     ok 86
101 wakaba 1.34 ok 87
102 wakaba 1.1 ok 88
103     ok 89
104     ok 90
105     ok 91
106     ok 92
107     ok 93
108     ok 94
109 wakaba 1.8 ok 95
110     ok 96
111     ok 97
112     ok 98
113     ok 99
114     ok 100
115 wakaba 1.96 ok 101
116     ok 102
117     ok 103
118     ok 104
119     not ok 105
120     # Test 105 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 158 fail #105)
121 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype >: qq'<!doctype >')
122 wakaba 1.43 # Line 5 is changed:
123     # - " qq'',\n"
124     # + " undef,\n"
125 wakaba 1.96 not ok 106
126     # Test 106 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 158 fail #106)
127 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype : qq'<!doctype ')
128 wakaba 1.43 # Line 5 is changed:
129     # - " qq'',\n"
130     # + " undef,\n"
131 wakaba 1.8 ok 107
132     ok 108
133     ok 109
134     ok 110
135     ok 111
136     ok 112
137     ok 113
138 wakaba 1.10 ok 114
139     ok 115
140     ok 116
141     ok 117
142     ok 118
143     ok 119
144     ok 120
145     ok 121
146 wakaba 1.39 ok 122
147 wakaba 1.18 ok 123
148     ok 124
149     ok 125
150     ok 126
151 wakaba 1.20 ok 127
152     ok 128
153     ok 129
154     ok 130
155     ok 131
156     ok 132
157     ok 133
158     ok 134
159     ok 135
160     ok 136
161 wakaba 1.21 ok 137
162     ok 138
163 wakaba 1.20 ok 139
164     ok 140
165     ok 141
166 wakaba 1.28 ok 142
167 wakaba 1.20 ok 143
168     ok 144
169     ok 145
170     ok 146
171 wakaba 1.130 ok 147
172 wakaba 1.22 ok 148
173     ok 149
174     ok 150
175 wakaba 1.130 ok 151
176 wakaba 1.22 ok 152
177     ok 153
178     ok 154
179     ok 155
180     ok 156
181 wakaba 1.28 ok 157
182     ok 158
183     ok 159
184     ok 160
185     ok 161
186     ok 162
187     ok 163
188     ok 164
189     ok 165
190     ok 166
191     ok 167
192     ok 168
193 wakaba 1.96 # t/tokenizer/test4.test
194 wakaba 1.28 ok 169
195     ok 170
196     ok 171
197     ok 172
198     ok 173
199     ok 174
200     ok 175
201     ok 176
202     ok 177
203     ok 178
204 wakaba 1.33 ok 179
205 wakaba 1.34 ok 180
206 wakaba 1.38 ok 181
207     ok 182
208 wakaba 1.43 ok 183
209     ok 184
210     ok 185
211     ok 186
212     ok 187
213     ok 188
214     ok 189
215     ok 190
216     ok 191
217     ok 192
218     ok 193
219     ok 194
220     ok 195
221     ok 196
222     ok 197
223 wakaba 1.96 ok 198
224     ok 199
225     ok 200
226     ok 201
227 wakaba 1.130 ok 202
228 wakaba 1.43 ok 203
229     ok 204
230     ok 205
231     ok 206
232     ok 207
233     ok 208
234     ok 209
235     ok 210
236     ok 211
237     ok 212
238     ok 213
239     ok 214
240     ok 215
241     ok 216
242     ok 217
243     ok 218
244     ok 219
245     ok 220
246 wakaba 1.130 not ok 221
247     # Test 221 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 158 fail #221)
248     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (U+0000 in lookahead region after non-matching character: qq'<!doc>\x{00}')
249     # Got 1 extra line at line 3:
250     # + " qq'ParseError',\n"
251     # Line 8 is missing:
252     # - " qq'ParseError',\n"
253 wakaba 1.43 ok 222
254     ok 223
255     ok 224
256     ok 225
257     ok 226
258     ok 227
259     ok 228
260     ok 229
261     ok 230
262     ok 231
263     ok 232
264     ok 233
265     ok 234
266     ok 235
267 wakaba 1.130 # t/tokenizer/contentModelFlags.test
268 wakaba 1.43 ok 236
269     ok 237
270     ok 238
271     ok 239
272     ok 240
273     ok 241
274     ok 242
275     ok 243
276     ok 244
277     ok 245
278     ok 246
279     ok 247
280     ok 248
281 wakaba 1.130 # t/tokenizer/escapeFlag.test
282 wakaba 1.43 ok 249
283     ok 250
284     ok 251
285     ok 252
286     ok 253
287     ok 254
288     ok 255
289 wakaba 1.130 # t/tokenizer-test-1.test
290 wakaba 1.43 ok 256
291     ok 257
292     ok 258
293     ok 259
294     ok 260
295     ok 261
296     ok 262
297     ok 263
298     ok 264
299     ok 265
300     ok 266
301     ok 267
302     ok 268
303     ok 269
304     ok 270
305     ok 271
306     ok 272
307     ok 273
308     ok 274
309     ok 275
310     ok 276
311     ok 277
312     ok 278
313     ok 279
314     ok 280
315     ok 281
316     ok 282
317     ok 283
318     ok 284
319     ok 285
320     ok 286
321     ok 287
322     ok 288
323     ok 289
324     ok 290
325     ok 291
326     ok 292
327     ok 293
328     ok 294
329     ok 295
330     ok 296
331     ok 297
332     ok 298
333     ok 299
334     ok 300
335     ok 301
336     ok 302
337     ok 303
338     ok 304
339     ok 305
340     ok 306
341     ok 307
342     ok 308
343     ok 309
344     ok 310
345     ok 311
346     ok 312
347     ok 313
348     ok 314
349     ok 315
350     ok 316
351     ok 317
352     ok 318
353     ok 319
354     ok 320
355     ok 321
356     ok 322
357     ok 323
358     ok 324
359     ok 325
360     ok 326
361     ok 327
362     ok 328
363     ok 329
364     ok 330
365     ok 331
366     ok 332
367     ok 333
368     ok 334
369     ok 335
370     ok 336
371     ok 337
372 wakaba 1.59 ok 338
373     ok 339
374     ok 340
375     ok 341
376     ok 342
377     ok 343
378     ok 344
379     ok 345
380     ok 346
381     ok 347
382 wakaba 1.62 ok 348
383     ok 349
384     ok 350
385     ok 351
386     ok 352
387     ok 353
388     ok 354
389     ok 355
390     ok 356
391     ok 357
392     ok 358
393     ok 359
394 wakaba 1.96 ok 360
395     ok 361
396     ok 362
397     ok 363
398 wakaba 1.129 ok 364
399     ok 365
400     ok 366
401     ok 367
402     ok 368
403     ok 369
404     ok 370
405     ok 371
406     ok 372
407     ok 373
408     ok 374
409     ok 375
410     ok 376
411     ok 377
412     ok 378
413     ok 379
414     ok 380
415     ok 381
416     ok 382
417     ok 383
418     ok 384
419     ok 385
420     ok 386
421     ok 387
422     ok 388
423     ok 389
424     ok 390
425     ok 391
426     ok 392
427     ok 393
428     ok 394
429     ok 395
430     ok 396
431 wakaba 1.130 ok 397
432     ok 398
433     ok 399
434     ok 400
435     ok 401
436     ok 402
437     ok 403
438     ok 404
439     ok 405
440     ok 406
441     ok 407
442     ok 408
443     ok 409
444     ok 410
445     ok 411
446     ok 412
447     ok 413
448     ok 414
449     ok 415
450     ok 416
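
The listing above is TAP (Test Anything Protocol) output as produced by Perl's Test.pm: a `1..N` plan line, one `ok`/`not ok` line per test, and `#` diagnostic lines. As a minimal sketch (not part of the original file; the function name `tally_tap` is hypothetical), a stream like this could be tallied as follows:

```python
import re

def tally_tap(lines):
    """Return (planned, passed, failed) counts for a TAP result stream."""
    planned = passed = failed = 0
    for line in lines:
        line = line.strip()
        m = re.match(r'^1\.\.(\d+)$', line)
        if m:
            planned = int(m.group(1))          # the test plan, e.g. "1..416"
        elif line.startswith('not ok'):
            failed += 1                        # check "not ok" before "ok"
        elif line.startswith('ok'):
            passed += 1
        # lines starting with '#' are diagnostics and are ignored
    return planned, passed, failed

print(tally_tap([
    '1..4',
    'ok 1',
    'not ok 2',
    '# Test 2 got: ...',
    'ok 3',
    'ok 4',
]))  # → (4, 3, 1)
```

Applied to the log above, such a tally would report 416 planned tests with 4 failures (tests 43, 105, 106, and 221), matching the diagnostics shown.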
