Contents of /markup/html/whatpm/t/tokenizer-result.txt

Revision 1.195
Fri Mar 28 14:23:10 2008 UTC by wakaba
Branch: MAIN
Changes since 1.194: +25 -8 lines
File MIME type: text/plain
Results updated

1 wakaba 1.141 1..472
2 wakaba 1.1 # Running under perl version 5.008007 for linux
3 wakaba 1.195 # Current time local: Fri Mar 28 23:14:27 2008
4     # Current time GMT: Fri Mar 28 14:14:27 2008
5 wakaba 1.1 # Using Test.pm version 1.25
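
The listing below is classic Test.pm TAP output: the "1..472" plan line
announces the planned test count, each "ok N" / "not ok N" line reports one
test, and "#" lines are comments or failure diagnostics. A minimal sketch of
the harness shape, assuming a driver along the lines of t/HTML-tokenizer.t
(the plan count and dump strings here are illustrative only):

    use strict;
    use warnings;
    use Test;
    BEGIN { plan tests => 1 }    # the real run plans 472 tests

    # Test.pm's two-argument ok() compares got against expected and, on a
    # mismatch, prints the "# Test N got: ... # Expected: ..." diagnostics
    # seen throughout this log (the log itself quotes the call site:
    # "t/HTML-tokenizer.t line 176 is: ok $parser_dump, $expected_dump").
    my $parser_dump   = "\$VAR1 = [ ... ];\n";    # hypothetical dumps
    my $expected_dump = "\$VAR1 = [ ... ];\n";
    ok $parser_dump, $expected_dump;
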
6 wakaba 1.11 # t/tokenizer/test1.test
7 wakaba 1.20 ok 1
8     ok 2
9     ok 3
10 wakaba 1.1 ok 4
11 wakaba 1.20 ok 5
12 wakaba 1.1 ok 6
13     ok 7
14     ok 8
15     ok 9
16     ok 10
17     ok 11
18     ok 12
19     ok 13
20     ok 14
21 wakaba 1.130 ok 15
22 wakaba 1.1 ok 16
23     ok 17
24     ok 18
25     ok 19
26     ok 20
27     ok 21
28 wakaba 1.25 ok 22
29     ok 23
30 wakaba 1.1 ok 24
31 wakaba 1.22 ok 25
32     ok 26
33     ok 27
34 wakaba 1.1 ok 28
35     ok 29
36     ok 30
37     ok 31
38     ok 32
39     ok 33
40 wakaba 1.18 ok 34
41 wakaba 1.1 ok 35
42     ok 36
43     ok 37
44 wakaba 1.8 ok 38
45 wakaba 1.28 ok 39
46     ok 40
47 wakaba 1.43 ok 41
48     ok 42
49 wakaba 1.11 # t/tokenizer/test2.test
50 wakaba 1.43 not ok 43
51 wakaba 1.137 # Test 43 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #43)
52 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (DOCTYPE without name: qq'<!DOCTYPE>')
53 wakaba 1.20 # Line 6 is changed:
54 wakaba 1.8 # - " qq'',\n"
55 wakaba 1.20 # + " undef,\n"
56 wakaba 1.137 # t/HTML-tokenizer.t line 176 is: ok $parser_dump, $expected_dump,
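
The dumps compared in the test 43 diagnostics are Data::Dumper renderings of
the tokenizer's token list, and the differing slot is the DOCTYPE token's
name field (layout ['DOCTYPE', name, public-id, system-id, correct-flag],
judging from the dumps). A minimal reconstruction of the comparison, assuming
plain Dumper output (the qq''-quoted strings in the log come from the
harness's own dumper configuration):

    use strict;
    use warnings;
    use Data::Dumper;
    $Data::Dumper::Indent = 1;

    # For qq'<!DOCTYPE>' the test data expects the DOCTYPE name to be the
    # empty string ...
    my $expected = ['ParseError', 'ParseError',
                    ['DOCTYPE', '', undef, undef, 0]];
    # ... while the tokenizer emitted it as undef, so the two dumps differ
    # exactly at the "Line 6" the diff above points at.
    my $got      = ['ParseError', 'ParseError',
                    ['DOCTYPE', undef, undef, undef, 0]];

    print Dumper ($got) eq Dumper ($expected)
        ? "ok 43\n" : "not ok 43\n";    # prints "not ok 43"
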
57 wakaba 1.20 ok 44
58     ok 45
59     ok 46
60     ok 47
61     ok 48
62     ok 49
63     ok 50
64     ok 51
65 wakaba 1.97 ok 52
66     ok 53
67     ok 54
68     ok 55
69 wakaba 1.9 ok 56
70     ok 57
71 wakaba 1.1 ok 58
72     ok 59
73     ok 60
74 wakaba 1.19 ok 61
75 wakaba 1.1 ok 62
76     ok 63
77 wakaba 1.130 ok 64
78 wakaba 1.1 ok 65
79     ok 66
80     ok 67
81     ok 68
82     ok 69
83     ok 70
84 wakaba 1.34 ok 71
85     ok 72
86 wakaba 1.1 ok 73
87     ok 74
88 wakaba 1.21 ok 75
89     ok 76
90 wakaba 1.1 ok 77
91 wakaba 1.141 ok 78
92 wakaba 1.96 # t/tokenizer/test3.test
93 wakaba 1.1 ok 79
94     ok 80
95 wakaba 1.34 ok 81
96 wakaba 1.15 ok 82
97 wakaba 1.1 ok 83
98     ok 84
99 wakaba 1.25 ok 85
100     ok 86
101 wakaba 1.34 ok 87
102 wakaba 1.1 ok 88
103     ok 89
104     ok 90
105     ok 91
106     ok 92
107     ok 93
108     ok 94
109 wakaba 1.8 ok 95
110     ok 96
111     ok 97
112     ok 98
113     ok 99
114     ok 100
115 wakaba 1.96 ok 101
116     ok 102
117     ok 103
118     ok 104
119 wakaba 1.141 ok 105
120     not ok 106
121     # Test 106 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #106)
122 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype >: qq'<!doctype >')
123 wakaba 1.43 # Line 5 is changed:
124     # - " qq'',\n"
125     # + " undef,\n"
126 wakaba 1.141 not ok 107
127     # Test 107 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #107)
128 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype : qq'<!doctype ')
129 wakaba 1.43 # Line 5 is changed:
130     # - " qq'',\n"
131     # + " undef,\n"
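
Tests 106 and 107 fail for the same reason as test 43: the expected data
encodes a missing DOCTYPE name as the empty string while the tokenizer
reports it as undef (with a single leading ParseError in these dumps, the
name sits on "Line 5"). One way to reconcile the two conventions before
comparing, shown as a hypothetical helper rather than anything present in
t/HTML-tokenizer.t:

    use strict;
    use warnings;

    # Map a missing DOCTYPE name (undef) to the empty string so both
    # conventions compare equal.  Token layout assumed from the dumps
    # above: ['DOCTYPE', name, public-id, system-id, flag].
    sub normalize_doctype_name {
      my ($tokens) = @_;
      for my $t (@$tokens) {
        next unless ref $t eq 'ARRAY' and $t->[0] eq 'DOCTYPE';
        $t->[1] = '' unless defined $t->[1];
      }
      return $tokens;
    }

    my $got = normalize_doctype_name (
        ['ParseError', ['DOCTYPE', undef, undef, undef, 0]]);
    # $got->[1][1] is now '' and would match the expected dump for test 106.
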
132 wakaba 1.8 ok 108
133     ok 109
134     ok 110
135     ok 111
136     ok 112
137     ok 113
138 wakaba 1.10 ok 114
139     ok 115
140     ok 116
141     ok 117
142     ok 118
143     ok 119
144     ok 120
145     ok 121
146 wakaba 1.39 ok 122
147 wakaba 1.18 ok 123
148     ok 124
149     ok 125
150     ok 126
151 wakaba 1.20 ok 127
152     ok 128
153     ok 129
154     ok 130
155 wakaba 1.141 ok 131
156 wakaba 1.20 ok 132
157     ok 133
158     ok 134
159     ok 135
160     ok 136
161 wakaba 1.21 ok 137
162     ok 138
163 wakaba 1.20 ok 139
164     ok 140
165     ok 141
166 wakaba 1.28 ok 142
167 wakaba 1.20 ok 143
168     ok 144
169     ok 145
170     ok 146
171 wakaba 1.130 ok 147
172 wakaba 1.22 ok 148
173     ok 149
174     ok 150
175 wakaba 1.130 ok 151
176 wakaba 1.22 ok 152
177     ok 153
178     ok 154
179     ok 155
180     ok 156
181 wakaba 1.28 ok 157
182     ok 158
183     ok 159
184     ok 160
185     ok 161
186     ok 162
187     ok 163
188     ok 164
189     ok 165
190     ok 166
191     ok 167
192     ok 168
193 wakaba 1.141 ok 169
194 wakaba 1.96 # t/tokenizer/test4.test
195 wakaba 1.28 ok 170
196     ok 171
197     ok 172
198     ok 173
199     ok 174
200     ok 175
201     ok 176
202     ok 177
203     ok 178
204 wakaba 1.33 ok 179
205 wakaba 1.34 ok 180
206 wakaba 1.38 ok 181
207     ok 182
208 wakaba 1.43 ok 183
209     ok 184
210     ok 185
211     ok 186
212     ok 187
213     ok 188
214     ok 189
215     ok 190
216     ok 191
217     ok 192
218     ok 193
219     ok 194
220     ok 195
221     ok 196
222     ok 197
223 wakaba 1.96 ok 198
224     ok 199
225     ok 200
226     ok 201
227 wakaba 1.130 ok 202
228 wakaba 1.43 ok 203
229     ok 204
230     ok 205
231     ok 206
232     ok 207
233     ok 208
234     ok 209
235     ok 210
236     ok 211
237     ok 212
238     ok 213
239     ok 214
240     ok 215
241     ok 216
242     ok 217
243     ok 218
244     ok 219
245     ok 220
246 wakaba 1.141 ok 221
247     not ok 222
248     # Test 222 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #222)
249 wakaba 1.130 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (U+0000 in lookahead region after non-matching character: qq'<!doc>\x{00}')
250     # Got 1 extra line at line 3:
251     # + " qq'ParseError',\n"
252     # Line 8 is missing:
253     # - " qq'ParseError',\n"
254 wakaba 1.43 ok 223
255 wakaba 1.195 not ok 224
256     # Test 224 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{80}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #224)
257     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{80}'\n ]\n ];\n" (U+0080 in lookahead region: qq'<!doc\x{80}')
258     # Line 3 is missing:
259     # - " qq'ParseError',\n"
260     not ok 225
261     # Test 225 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{FDD1}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #225)
262     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{FDD1}'\n ]\n ];\n" (U+FDD1 in lookahead region: qq'<!doc\x{FDD1}')
263     # Line 3 is missing:
264     # - " qq'ParseError',\n"
265     not ok 226
266     # Test 226 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{1FFFF}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 176 fail #226)
267     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc\\x{1FFFF}'\n ]\n ];\n" (U+1FFFF in lookahead region: qq'<!doc\x{1FFFF}')
268     # Line 3 is missing:
269     # - " qq'ParseError',\n"
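
Tests 222 and 224 through 226 all concern characters consumed while the
tokenizer looks ahead after "<!" for "DOCTYPE" or "[CDATA[": test 222 expects
the error for the trailing U+0000 to be reported after the bogus-comment
token rather than before it, and tests 224-226 expect a dedicated ParseError
for a control character (U+0080) or a noncharacter (U+FDD1, U+1FFFF) seen in
the lookahead region. A rough illustration of the per-character screening the
test data implies, assuming the character classes named in the test
descriptions (this is not the code in Whatpm::HTML):

    use strict;
    use warnings;

    # Screen the characters of the lookahead region for U+0000, other
    # control characters, and noncharacters, each worth its own ParseError.
    sub lookahead_parse_errors {
      my ($s) = @_;
      my @errors;
      for my $code (map { ord } split //, $s) {
        if (($code < 0x20 and $code != 0x09 and $code != 0x0A
             and $code != 0x0C and $code != 0x0D)
            or ($code >= 0x7F and $code <= 0x9F)) {
          push @errors, sprintf 'control character U+%04X', $code;
        } elsif (($code >= 0xFDD0 and $code <= 0xFDEF)
                 or ($code & 0xFFFE) == 0xFFFE) {
          push @errors, sprintf 'noncharacter U+%04X', $code;
        }
      }
      return @errors;
    }

    # qq'<!doc\x{80}' leaves "doc\x{80}" in the lookahead region; its
    # U+0080 accounts for the second ParseError that test 224 expects
    # before the Comment token.
    my @e = lookahead_parse_errors ("doc\x{80}");
    print scalar @e, " lookahead parse error(s)\n";    # prints 1
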
270 wakaba 1.43 ok 227
271     ok 228
272     ok 229
273     ok 230
274     ok 231
275     ok 232
276     ok 233
277     ok 234
278     ok 235
279 wakaba 1.141 ok 236
280 wakaba 1.43 ok 237
281     ok 238
282     ok 239
283 wakaba 1.195 # t/tokenizer/contentModelFlags.test
284 wakaba 1.43 ok 240
285     ok 241
286     ok 242
287     ok 243
288     ok 244
289     ok 245
290     ok 246
291     ok 247
292     ok 248
293 wakaba 1.141 ok 249
294 wakaba 1.43 ok 250
295     ok 251
296     ok 252
297 wakaba 1.195 # t/tokenizer/escapeFlag.test
298 wakaba 1.43 ok 253
299     ok 254
300     ok 255
301 wakaba 1.141 ok 256
302 wakaba 1.43 ok 257
303     ok 258
304     ok 259
305     ok 260
306     ok 261
307 wakaba 1.195 # t/tokenizer-test-1.test
308 wakaba 1.43 ok 262
309     ok 263
310     ok 264
311     ok 265
312     ok 266
313     ok 267
314     ok 268
315     ok 269
316     ok 270
317     ok 271
318     ok 272
319     ok 273
320     ok 274
321     ok 275
322     ok 276
323     ok 277
324     ok 278
325     ok 279
326     ok 280
327     ok 281
328     ok 282
329     ok 283
330     ok 284
331     ok 285
332     ok 286
333     ok 287
334     ok 288
335     ok 289
336     ok 290
337     ok 291
338     ok 292
339     ok 293
340     ok 294
341     ok 295
342     ok 296
343     ok 297
344     ok 298
345     ok 299
346     ok 300
347     ok 301
348     ok 302
349     ok 303
350     ok 304
351     ok 305
352     ok 306
353     ok 307
354     ok 308
355     ok 309
356     ok 310
357     ok 311
358     ok 312
359     ok 313
360     ok 314
361     ok 315
362     ok 316
363     ok 317
364     ok 318
365     ok 319
366     ok 320
367     ok 321
368     ok 322
369     ok 323
370     ok 324
371     ok 325
372     ok 326
373     ok 327
374     ok 328
375     ok 329
376     ok 330
377     ok 331
378     ok 332
379     ok 333
380     ok 334
381     ok 335
382     ok 336
383     ok 337
384 wakaba 1.59 ok 338
385     ok 339
386     ok 340
387     ok 341
388     ok 342
389     ok 343
390     ok 344
391     ok 345
392     ok 346
393     ok 347
394 wakaba 1.62 ok 348
395     ok 349
396     ok 350
397     ok 351
398     ok 352
399     ok 353
400     ok 354
401     ok 355
402     ok 356
403     ok 357
404     ok 358
405     ok 359
406 wakaba 1.96 ok 360
407     ok 361
408     ok 362
409     ok 363
410 wakaba 1.129 ok 364
411     ok 365
412     ok 366
413     ok 367
414     ok 368
415     ok 369
416     ok 370
417     ok 371
418     ok 372
419     ok 373
420     ok 374
421     ok 375
422     ok 376
423     ok 377
424     ok 378
425     ok 379
426     ok 380
427     ok 381
428     ok 382
429     ok 383
430     ok 384
431     ok 385
432     ok 386
433     ok 387
434     ok 388
435     ok 389
436     ok 390
437     ok 391
438     ok 392
439     ok 393
440     ok 394
441     ok 395
442     ok 396
443 wakaba 1.130 ok 397
444     ok 398
445     ok 399
446     ok 400
447     ok 401
448     ok 402
449     ok 403
450     ok 404
451     ok 405
452     ok 406
453     ok 407
454     ok 408
455     ok 409
456     ok 410
457     ok 411
458     ok 412
459     ok 413
460     ok 414
461     ok 415
462     ok 416
463 wakaba 1.132 ok 417
464     ok 418
465     ok 419
466     ok 420
467 wakaba 1.136 ok 421
468     ok 422
469     ok 423
470     ok 424
471     ok 425
472     ok 426
473     ok 427
474     ok 428
475     ok 429
476     ok 430
477     ok 431
478     ok 432
479     ok 433
480     ok 434
481     ok 435
482     ok 436
483     ok 437
484     ok 438
485     ok 439
486     ok 440
487     ok 441
488     ok 442
489     ok 443
490     ok 444
491     ok 445
492     ok 446
493     ok 447
494     ok 448
495     ok 449
496     ok 450
497     ok 451
498     ok 452
499     ok 453
500     ok 454
501     ok 455
502     ok 456
503     ok 457
504     ok 458
505     ok 459
506     ok 460
507     ok 461
508     ok 462
509     ok 463
510     ok 464
511     ok 465
512     ok 466
513     ok 467
514     ok 468
515     ok 469
516     ok 470
517     ok 471
518 wakaba 1.141 ok 472
519 wakaba 1.195 ok 473
520     ok 474
521     ok 475
522     ok 476
523     ok 477
