storage, libaio, or XFS problem? 3.4.26
Stan Hoeppner
2014-08-26 06:18:50 UTC
Had some controller issues but I believe we had those ironed out before
this recent breakage. I had reformatted both 48TB LUNs on this test box
with -f and defaults, and fired up the test app again. Throughput was
fantastic with no dropped application buffers for ~30 minutes, and IO
times were 1.x ms max, suggesting all the writes were acked by controller
cache. The app was running perfectly. Then it hung and spat out an
internal error msg:


Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005

I have many call traces in dmesg, most of them XFS; they're pasted below.
I found no SCSI I/O errors in dmesg. The application is submitting writes
via libaio with O_DIRECT, writing to preallocated files. I don't see any
errors in the storage controller log and everything seems to check out
there. Using the noop elevator, mpt2sas, no multipathing. The application
process is hung in D state and kill -9 won't get rid of it. I can't
unmount the hosed-up filesystem. Any ideas?
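
For context, the per-file write pattern boils down to the minimal sketch
below. This is not the actual streamRT-sa source; the path, sizes, and
queue depth here are illustrative only. It's just: preallocate, then
O_DIRECT writes submitted with io_submit.

/*
 * Minimal sketch, assuming a made-up file name and sizes:
 * preallocate a file, then submit O_DIRECT writes through libaio.
 * Build: gcc -D_GNU_SOURCE -o aio-dio aio-dio.c -laio
 */
#include <fcntl.h>
#include <libaio.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define BUFSZ  (1 << 20)      /* 1 MiB; a multiple of the sector size */
#define FILESZ (64LL << 20)   /* preallocate 64 MiB (illustrative) */

int main(void)
{
    void *buf;
    io_context_t ctx = 0;
    int rc, fd = open("/mnt/VOL1/testfile",
                      O_CREAT | O_WRONLY | O_DIRECT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* Preallocation creates unwritten extents; DIO writes into them
     * drive the xfs_iomap_write_unwritten conversion that the kworker
     * in the traces below is stuck in. */
    rc = posix_fallocate(fd, 0, FILESZ);
    if (rc) { fprintf(stderr, "fallocate: %s\n", strerror(rc)); return 1; }

    /* O_DIRECT requires sector-aligned buffer, offset, and length. */
    rc = posix_memalign(&buf, 4096, BUFSZ);
    if (rc) { fprintf(stderr, "memalign: %s\n", strerror(rc)); return 1; }
    memset(buf, 0xab, BUFSZ);

    rc = io_setup(32, &ctx);             /* libaio returns -errno */
    if (rc) { fprintf(stderr, "io_setup: %s\n", strerror(-rc)); return 1; }

    for (long long off = 0; off < FILESZ; off += BUFSZ) {
        struct iocb cb, *cbs[1] = { &cb };
        struct io_event ev;

        io_prep_pwrite(&cb, fd, buf, BUFSZ, off);
        rc = io_submit(ctx, 1, cbs);     /* blocked tasks below sit here */
        if (rc != 1) { fprintf(stderr, "io_submit: %s\n", strerror(-rc)); return 1; }
        if (io_getevents(ctx, 1, 1, &ev, NULL) != 1) return 1;
        if ((long)ev.res != BUFSZ) {
            fprintf(stderr, "write at %lld: res %ld\n", off, (long)ev.res);
            return 1;
        }
    }

    io_destroy(ctx);
    close(fd);
    free(buf);
    return 0;
}

Note that every blocked streamRT-sa trace below is sitting in
sys_io_submit on this path, and the kworker is stuck in
xfs_iomap_write_unwritten, i.e. the unwritten extent conversion on the
completion side of these writes.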

# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error

# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error

# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s

# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1

# umount /mnt/VOL1
umount: /mnt/VOL1: device is busy.
(In some cases useful info about processes that use
the device is found by lsof(8) or fuser(1))



Kernel 3.4.26
xfs_repair version 3.1.4
2 socket, 20 core Xeon(R) CPU E5-2650 v2 @ 2.60GHz

MemTotal: 264573184 kB
MemFree: 246519624 kB
Buffers: 16820 kB
Cached: 78120 kB
SwapCached: 0 kB
Active: 13130040 kB
Inactive: 75352 kB
Active(anon): 13110512 kB
Inactive(anon): 276 kB
Active(file): 19528 kB
Inactive(file): 75076 kB
Unevictable: 0 kB
Mlocked: 0 kB
SwapTotal: 0 kB
SwapFree: 0 kB
Dirty: 0 kB
Writeback: 0 kB
AnonPages: 13126968 kB
Mapped: 11848 kB
Shmem: 340 kB
Slab: 208476 kB
SReclaimable: 118240 kB
SUnreclaim: 90236 kB
KernelStack: 3976 kB
PageTables: 32840 kB
NFS_Unstable: 0 kB
Bounce: 0 kB
WritebackTmp: 0 kB
CommitLimit: 132286592 kB
Committed_AS: 25044716 kB
VmallocTotal: 34359738367 kB
VmallocUsed: 1050608 kB
VmallocChunk: 34358680400 kB
AnonHugePages: 13078528 kB
HugePages_Total: 0
HugePages_Free: 0
HugePages_Rsvd: 0
HugePages_Surp: 0
Hugepagesize: 2048 kB
DirectMap4k: 5056 kB
DirectMap2M: 2045952 kB
DirectMap1G: 266338304 kB

/dev/sdc /mnt/VOL0 xfs rw,noatime,nodiratime,attr2,nobarrier,inode64,noquota 0 0
/dev/sdd /mnt/VOL1 xfs rw,noatime,nodiratime,attr2,nobarrier,inode64,noquota 0 0

major minor #blocks name
8 32 46837141504 sdc
8 48 46837141504 sdd

2x hardware RAID5 LUNs - 64KB su, 768KB sw (12 data spindles x 64KB)
2 controllers, one per LUN, 3GB cache each, write back, FPGA RAID engine
No LVM. LUNs directly formatted with XFS
26 Seagate SAS 3.5" 7.2K drives, 13 per RAID5, drive caches disabled
My counterpart had what I'd guess is this same problem on the full test
rig, which has 16 of these LUNs, 208 drives total. He was also testing
with libaio. I have not dug into the logs on that host as of yet. Its
config is identical to this box except for the number of LUNs and drives.


[22635.102013] INFO: task kworker/7:0:45 blocked for more than 120
seconds.
[22635.102016] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102018] kworker/7:0 D ffff8840666c0b08 0 45 2
0x00000000
[22635.102021] ffff8840666e7bd0 0000000000000046 ffff883f7c02d000
ffff8840666f5180
[22635.102026] ffff8840666e7b80 0000000000000206 00000000000122c0
00000000000122c0
[22635.102030] ffff8840666e7fd8 ffff8840666e6000 00000000000122c0
ffff8840666e6000
[22635.102033] Call Trace:
[22635.102041] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102044] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102047] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102051] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102055] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102070] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102077] [<ffffffffa01b5ab0>] xfs_iomap_write_unwritten+0x1b3/0x32e
[xfs]
[22635.102080] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102084] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102088] [<ffffffff810bc602>] ? mempool_free+0x73/0x78
[22635.102093] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102098] [<ffffffffa01ab45b>] xfs_end_io+0x89/0xb4 [xfs]
[22635.102102] [<ffffffff81047321>] process_one_work+0x204/0x327
[22635.102105] [<ffffffff8104757f>] worker_thread+0x13b/0x25a
[22635.102108] [<ffffffff81047444>] ? process_one_work+0x327/0x327
[22635.102111] [<ffffffff8104af6f>] kthread+0x89/0x91
[22635.102115] [<ffffffff814fdbd4>] kernel_thread_helper+0x4/0x10
[22635.102118] [<ffffffff8104aee6>] ? __init_kthread_worker+0x3c/0x3c
[22635.102120] [<ffffffff814fdbd0>] ? gs_change+0xb/0xb
[22635.102131] INFO: task streamRT-sa:5891 blocked for more than 120
seconds.
[22635.102132] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102134] streamRT-sa D ffff883f52850348 0 5891 5114
0x00000004
[22635.102136] ffff884063af5b28 0000000000000082 000000000000029c
ffff884066679100
[22635.102140] 0000000000000006 0000000000000001 00000000000122c0
00000000000122c0
[22635.102143] ffff884063af5fd8 ffff884063af4000 00000000000122c0
ffff884063af4000
[22635.102146] Call Trace:
[22635.102149] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102151] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102154] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102156] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102158] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102164] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102170] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102172] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102178] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102184] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102189] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102195] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102199] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102202] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102204] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102207] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102209] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102212] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102215] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102217] INFO: task streamRT-sa:5895 blocked for more than 120
seconds.
[22635.102218] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102219] streamRT-sa D ffff883f537fa3c8 0 5895 5114
0x00000004
[22635.102222] ffff883f7d057b28 0000000000000082 0000000000000000
ffff884066227100
[22635.102225] ffff883f7d057ad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102228] ffff883f7d057fd8 ffff883f7d056000 00000000000122c0
ffff883f7d056000
[22635.102232] Call Trace:
[22635.102234] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102236] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102239] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102241] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102243] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102249] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102254] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102257] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102262] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102268] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102274] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102279] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102282] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102284] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102286] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102288] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102291] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102294] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102297] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102299] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102301] INFO: task streamRT-sa:5900 blocked for more than 120
seconds.
[22635.102302] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102303] streamRT-sa D ffff883f53003b88 0 5900 5114
0x00000004
[22635.102305] ffff883f7d109b28 0000000000000082 0000000000000000
ffff884066679100
[22635.102309] ffff883f7d109ad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102312] ffff883f7d109fd8 ffff883f7d108000 00000000000122c0
ffff883f7d108000
[22635.102315] Call Trace:
[22635.102318] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102320] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102322] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102324] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102327] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102332] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102338] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102340] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102346] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102351] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102357] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102362] [<ffffffff8105539b>] ? resched_task+0x3e/0x75
[22635.102367] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102369] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102372] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102374] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102376] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102379] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102382] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102384] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102386] INFO: task streamRT-sa:5904 blocked for more than 120
seconds.
[22635.102387] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102388] streamRT-sa D ffff88405fd50a48 0 5904 5114
0x00000004
[22635.102390] ffff883f51c2bb28 0000000000000082 0000000000000000
ffff8840662d0080
[22635.102394] ffff883f51c2bad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102397] ffff883f51c2bfd8 ffff883f51c2a000 00000000000122c0
ffff883f51c2a000
[22635.102400] Call Trace:
[22635.102402] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102405] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102407] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102409] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102412] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102417] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102423] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102425] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102430] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102436] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102442] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102447] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102450] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102452] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102454] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102457] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102459] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102461] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102464] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102466] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102468] INFO: task streamRT-sa:5906 blocked for more than 120
seconds.
[22635.102469] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102470] streamRT-sa D ffff88405fe2ca88 0 5906 5114
0x00000004
[22635.102473] ffff883f51c2fb28 0000000000000082 0000000000000000
ffff8840667f00c0
[22635.102476] 0000000000000000 0000000000000000 00000000000122c0
00000000000122c0
[22635.102479] ffff883f51c2ffd8 ffff883f51c2e000 00000000000122c0
ffff883f51c2e000
[22635.102482] Call Trace:
[22635.102485] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102487] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102489] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102491] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102494] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102499] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102505] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102507] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102512] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102518] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102524] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102529] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102532] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102534] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102536] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102538] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102541] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102543] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102545] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102548] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102550] INFO: task streamRT-sa:5908 blocked for more than 120
seconds.
[22635.102551] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102552] streamRT-sa D ffff88405fe2fac8 0 5908 5114
0x00000004
[22635.102554] ffff883f518b3b28 0000000000000082 000000000000029c
ffff884066770040
[22635.102558] 0000000000000006 0000000000000001 00000000000122c0
00000000000122c0
[22635.102561] ffff883f518b3fd8 ffff883f518b2000 00000000000122c0
ffff883f518b2000
[22635.102564] Call Trace:
[22635.102566] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102569] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102571] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102573] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102575] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102581] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102586] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102589] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102594] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102600] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102620] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102627] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102630] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102637] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102641] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102651] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102655] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102661] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102664] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102669] INFO: task streamRT-sa:5909 blocked for more than 120
seconds.
[22635.102675] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102677] streamRT-sa D ffff88405fe2f3c8 0 5909 5114
0x00000004
[22635.102683] ffff883f518b5b28 0000000000000082 00000000000000de
ffff88406663a040
[22635.102694] 0000000000000002 0000000000000001 00000000000122c0
00000000000122c0
[22635.102705] ffff883f518b5fd8 ffff883f518b4000 00000000000122c0
ffff883f518b4000
[22635.102719] Call Trace:
[22635.102723] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102729] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102734] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102739] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102742] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102751] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102761] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102765] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102773] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102781] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102788] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102798] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102801] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102805] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102809] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102812] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102815] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102818] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102824] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102827] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102834] INFO: task streamRT-sa:5911 blocked for more than 120
seconds.
[22635.102836] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102838] streamRT-sa D ffff883f5372c408 0 5911 5114
0x00000004
[22635.102844] ffff883f5208bb28 0000000000000082 0000000000000000
ffff884066001180
[22635.102850] 0000000000000000 0000000000000000 00000000000122c0
00000000000122c0
[22635.102861] ffff883f5208bfd8 ffff883f5208a000 00000000000122c0
ffff883f5208a000
[22635.102877] Call Trace:
[22635.102883] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102886] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102890] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102895] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102902] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102911] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102918] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102924] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102931] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102939] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102947] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102954] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102959] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102962] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102968] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102973] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102976] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102980] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102985] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102989] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102995] INFO: task streamRT-sa:5918 blocked for more than 120
seconds.
[22635.102997] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.103001] streamRT-sa D ffff883f53733a48 0 5918 5114
0x00000004
[22635.103007] ffff884063d99b28 0000000000000082 0000000000000000
ffff88406668b1c0
[22635.103017] 0000000000000000 0000000000000000 00000000000122c0
00000000000122c0
[22635.103029] ffff884063d99fd8 ffff884063d98000 00000000000122c0
ffff884063d98000
[22635.103037] Call Trace:
[22635.103042] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.103047] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.103050] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.103054] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.103058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.103065] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.103072] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.103076] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.103086] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.103092] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.103099] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.103109] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.103113] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.103120] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.103126] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.103129] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.103134] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.103139] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.103142] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.103146] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b

# ps -ef
UID PID PPID C STIME TTY TIME CMD
root 1 0 0 Aug25 ? 00:00:05 init [3]
root 2 0 0 Aug25 ? 00:00:00 [kthreadd]
root 3 2 0 Aug25 ? 00:00:00 [ksoftirqd/0]
root 6 2 0 Aug25 ? 00:00:00 [migration/0]
root 7 2 0 Aug25 ? 00:00:00 [rcuc/0]
root 8 2 0 Aug25 ? 00:00:00 [rcun/0]
root 9 2 0 Aug25 ? 00:00:00 [rcub/0]
root 10 2 0 Aug25 ? 00:00:00 [rcun/1]
root 11 2 0 Aug25 ? 00:00:00 [rcub/1]
root 12 2 0 Aug25 ? 00:00:00 [watchdog/0]
root 13 2 0 Aug25 ? 00:00:00 [migration/1]
root 15 2 0 Aug25 ? 00:00:00 [rcuc/1]
root 16 2 0 Aug25 ? 00:00:00 [ksoftirqd/1]
root 18 2 0 Aug25 ? 00:00:00 [watchdog/1]
root 19 2 0 Aug25 ? 00:00:00 [migration/2]
root 20 2 0 Aug25 ? 00:00:00 [kworker/2:0]
root 21 2 0 Aug25 ? 00:00:00 [rcuc/2]
root 22 2 0 Aug25 ? 00:00:00 [ksoftirqd/2]
root 23 2 0 Aug25 ? 00:00:00 [watchdog/2]
root 24 2 0 Aug25 ? 00:00:00 [migration/3]
root 26 2 0 Aug25 ? 00:00:00 [rcuc/3]
root 27 2 0 Aug25 ? 00:00:00 [ksoftirqd/3]
root 28 2 0 Aug25 ? 00:00:00 [watchdog/3]
root 29 2 0 Aug25 ? 00:00:00 [migration/4]
root 30 2 0 Aug25 ? 00:00:00 [kworker/4:0]
root 31 2 0 Aug25 ? 00:00:00 [rcuc/4]
root 32 2 0 Aug25 ? 00:00:00 [ksoftirqd/4]
root 33 2 0 Aug25 ? 00:00:00 [watchdog/4]
root 34 2 0 Aug25 ? 00:00:00 [migration/5]
root 36 2 0 Aug25 ? 00:00:00 [rcuc/5]
root 37 2 0 Aug25 ? 00:00:00 [ksoftirqd/5]
root 38 2 0 Aug25 ? 00:00:00 [watchdog/5]
root 39 2 0 Aug25 ? 00:00:00 [migration/6]
root 40 2 0 Aug25 ? 00:00:00 [kworker/6:0]
root 41 2 0 Aug25 ? 00:00:00 [rcuc/6]
root 42 2 0 Aug25 ? 00:00:00 [ksoftirqd/6]
root 43 2 0 Aug25 ? 00:00:00 [watchdog/6]
root 44 2 0 Aug25 ? 00:00:00 [migration/7]
root 45 2 0 Aug25 ? 00:00:00 [kworker/7:0]
root 46 2 0 Aug25 ? 00:00:00 [rcuc/7]
root 47 2 0 Aug25 ? 00:00:00 [ksoftirqd/7]
root 48 2 0 Aug25 ? 00:00:00 [watchdog/7]
root 49 2 0 Aug25 ? 00:00:00 [migration/8]
root 50 2 0 Aug25 ? 00:00:00 [kworker/8:0]
root 51 2 0 Aug25 ? 00:00:00 [rcuc/8]
root 52 2 0 Aug25 ? 00:00:00 [ksoftirqd/8]
root 53 2 0 Aug25 ? 00:00:00 [watchdog/8]
root 54 2 0 Aug25 ? 00:00:00 [migration/9]
root 56 2 0 Aug25 ? 00:00:00 [rcuc/9]
root 57 2 0 Aug25 ? 00:00:00 [ksoftirqd/9]
root 58 2 0 Aug25 ? 00:00:00 [watchdog/9]
root 59 2 0 Aug25 ? 00:00:00 [migration/10]
root 60 2 0 Aug25 ? 00:00:00 [kworker/10:0]
root 61 2 0 Aug25 ? 00:00:00 [rcuc/10]
root 62 2 0 Aug25 ? 00:00:00 [ksoftirqd/10]
root 63 2 0 Aug25 ? 00:00:00 [watchdog/10]
root 64 2 0 Aug25 ? 00:00:00 [migration/11]
root 66 2 0 Aug25 ? 00:00:00 [rcuc/11]
root 67 2 0 Aug25 ? 00:00:00 [ksoftirqd/11]
root 68 2 0 Aug25 ? 00:00:00 [watchdog/11]
root 69 2 0 Aug25 ? 00:00:00 [migration/12]
root 70 2 0 Aug25 ? 00:00:00 [kworker/12:0]
root 71 2 0 Aug25 ? 00:00:00 [rcuc/12]
root 72 2 0 Aug25 ? 00:00:00 [ksoftirqd/12]
root 73 2 0 Aug25 ? 00:00:00 [watchdog/12]
root 74 2 0 Aug25 ? 00:00:00 [migration/13]
root 76 2 0 Aug25 ? 00:00:00 [rcuc/13]
root 77 2 0 Aug25 ? 00:00:00 [ksoftirqd/13]
root 78 2 0 Aug25 ? 00:00:00 [watchdog/13]
root 79 2 0 Aug25 ? 00:00:00 [migration/14]
root 80 2 0 Aug25 ? 00:00:00 [kworker/14:0]
root 81 2 0 Aug25 ? 00:00:00 [rcuc/14]
root 82 2 0 Aug25 ? 00:00:00 [ksoftirqd/14]
root 83 2 0 Aug25 ? 00:00:00 [watchdog/14]
root 84 2 0 Aug25 ? 00:00:00 [migration/15]
root 86 2 0 Aug25 ? 00:00:00 [rcuc/15]
root 87 2 0 Aug25 ? 00:00:00 [ksoftirqd/15]
root 88 2 0 Aug25 ? 00:00:00 [watchdog/15]
root 89 2 0 Aug25 ? 00:00:00 [migration/16]
root 90 2 0 Aug25 ? 00:00:00 [kworker/16:0]
root 91 2 0 Aug25 ? 00:00:00 [rcuc/16]
root 92 2 0 Aug25 ? 00:00:00 [rcun/2]
root 93 2 0 Aug25 ? 00:00:00 [rcub/2]
root 94 2 0 Aug25 ? 00:00:00 [ksoftirqd/16]
root 95 2 0 Aug25 ? 00:00:00 [watchdog/16]
root 96 2 0 Aug25 ? 00:00:00 [migration/17]
root 98 2 0 Aug25 ? 00:00:00 [rcuc/17]
root 99 2 0 Aug25 ? 00:00:00 [ksoftirqd/17]
root 100 2 0 Aug25 ? 00:00:00 [watchdog/17]
root 101 2 0 Aug25 ? 00:00:00 [migration/18]
root 102 2 0 Aug25 ? 00:00:00 [kworker/18:0]
root 103 2 0 Aug25 ? 00:00:00 [rcuc/18]
root 104 2 0 Aug25 ? 00:00:00 [ksoftirqd/18]
root 105 2 0 Aug25 ? 00:00:00 [watchdog/18]
root 106 2 0 Aug25 ? 00:00:00 [migration/19]
root 107 2 0 Aug25 ? 00:00:00 [kworker/19:0]
root 108 2 0 Aug25 ? 00:00:00 [rcuc/19]
root 109 2 0 Aug25 ? 00:00:00 [ksoftirqd/19]
root 110 2 0 Aug25 ? 00:00:00 [watchdog/19]
root 111 2 0 Aug25 ? 00:00:00 [migration/20]
root 112 2 0 Aug25 ? 00:00:00 [kworker/20:0]
root 113 2 0 Aug25 ? 00:00:00 [rcuc/20]
root 114 2 0 Aug25 ? 00:00:00 [ksoftirqd/20]
root 115 2 0 Aug25 ? 00:00:00 [watchdog/20]
root 116 2 0 Aug25 ? 00:00:00 [migration/21]
root 117 2 0 Aug25 ? 00:00:00 [kworker/21:0]
root 118 2 0 Aug25 ? 00:00:00 [rcuc/21]
root 119 2 0 Aug25 ? 00:00:00 [ksoftirqd/21]
root 120 2 0 Aug25 ? 00:00:00 [watchdog/21]
root 121 2 0 Aug25 ? 00:00:00 [migration/22]
root 122 2 0 Aug25 ? 00:00:00 [kworker/22:0]
root 123 2 0 Aug25 ? 00:00:00 [rcuc/22]
root 124 2 0 Aug25 ? 00:00:00 [ksoftirqd/22]
root 125 2 0 Aug25 ? 00:00:00 [watchdog/22]
root 126 2 0 Aug25 ? 00:00:00 [migration/23]
root 128 2 0 Aug25 ? 00:00:00 [rcuc/23]
root 129 2 0 Aug25 ? 00:00:00 [ksoftirqd/23]
root 130 2 0 Aug25 ? 00:00:00 [watchdog/23]
root 131 2 0 Aug25 ? 00:00:00 [migration/24]
root 132 2 0 Aug25 ? 00:00:00 [kworker/24:0]
root 133 2 0 Aug25 ? 00:00:00 [rcuc/24]
root 134 2 0 Aug25 ? 00:00:00 [ksoftirqd/24]
root 135 2 0 Aug25 ? 00:00:00 [watchdog/24]
root 136 2 0 Aug25 ? 00:00:00 [migration/25]
root 138 2 0 Aug25 ? 00:00:00 [rcuc/25]
root 139 2 0 Aug25 ? 00:00:00 [ksoftirqd/25]
root 140 2 0 Aug25 ? 00:00:00 [watchdog/25]
root 141 2 0 Aug25 ? 00:00:00 [migration/26]
root 142 2 0 Aug25 ? 00:00:00 [kworker/26:0]
root 143 2 0 Aug25 ? 00:00:00 [rcuc/26]
root 144 2 0 Aug25 ? 00:00:00 [ksoftirqd/26]
root 145 2 0 Aug25 ? 00:00:00 [watchdog/26]
root 146 2 0 Aug25 ? 00:00:00 [migration/27]
root 147 2 0 Aug25 ? 00:00:00 [kworker/27:0]
root 148 2 0 Aug25 ? 00:00:00 [rcuc/27]
root 149 2 0 Aug25 ? 00:00:00 [ksoftirqd/27]
root 150 2 0 Aug25 ? 00:00:00 [watchdog/27]
root 151 2 0 Aug25 ? 00:00:00 [migration/28]
root 152 2 0 Aug25 ? 00:00:00 [kworker/28:0]
root 153 2 0 Aug25 ? 00:00:00 [rcuc/28]
root 154 2 0 Aug25 ? 00:00:00 [ksoftirqd/28]
root 155 2 0 Aug25 ? 00:00:00 [watchdog/28]
root 156 2 0 Aug25 ? 00:00:00 [migration/29]
root 157 2 0 Aug25 ? 00:00:00 [kworker/29:0]
root 158 2 0 Aug25 ? 00:00:00 [rcuc/29]
root 159 2 0 Aug25 ? 00:00:00 [ksoftirqd/29]
root 160 2 0 Aug25 ? 00:00:00 [watchdog/29]
root 161 2 0 Aug25 ? 00:00:00 [migration/30]
root 162 2 0 Aug25 ? 00:00:00 [kworker/30:0]
root 163 2 0 Aug25 ? 00:00:00 [rcuc/30]
root 164 2 0 Aug25 ? 00:00:00 [ksoftirqd/30]
root 165 2 0 Aug25 ? 00:00:00 [watchdog/30]
root 166 2 0 Aug25 ? 00:00:00 [migration/31]
root 167 2 0 Aug25 ? 00:00:00 [kworker/31:0]
root 168 2 0 Aug25 ? 00:00:00 [rcuc/31]
root 169 2 0 Aug25 ? 00:00:00 [ksoftirqd/31]
root 170 2 0 Aug25 ? 00:00:00 [watchdog/31]
root 171 2 0 Aug25 ? 00:00:00 [cpuset]
root 172 2 0 Aug25 ? 00:00:00 [khelper]
root 173 2 0 Aug25 ? 00:00:00 [netns]
root 418 2 0 Aug25 ? 00:00:00 [sync_supers]
root 420 2 0 Aug25 ? 00:00:00 [bdi-default]
root 422 2 0 Aug25 ? 00:00:00 [kblockd]
root 714 2 0 Aug25 ? 00:00:00 [ata_sff]
root 724 2 0 Aug25 ? 00:00:00 [khubd]
root 733 2 0 Aug25 ? 00:00:00 [kworker/30:1]
root 735 2 0 Aug25 ? 00:00:00 [kworker/28:1]
root 737 2 0 Aug25 ? 00:00:00 [kworker/26:1]
root 739 2 0 Aug25 ? 00:00:00 [kworker/24:1]
root 741 2 0 Aug25 ? 00:00:00 [kworker/22:1]
root 743 2 0 Aug25 ? 00:00:00 [kworker/20:1]
root 745 2 0 Aug25 ? 00:00:00 [kworker/18:1]
root 747 2 0 Aug25 ? 00:00:00 [kworker/16:1]
root 749 2 0 Aug25 ? 00:00:00 [kworker/14:1]
root 751 2 0 Aug25 ? 00:00:00 [kworker/12:1]
root 757 2 0 Aug25 ? 00:00:00 [kworker/10:1]
root 759 2 0 Aug25 ? 00:00:00 [kworker/8:1]
root 761 2 0 Aug25 ? 00:00:00 [kworker/6:1]
root 763 2 0 Aug25 ? 00:00:00 [kworker/4:1]
root 866 2 0 Aug25 ? 00:00:00 [rpciod]
root 868 2 0 Aug25 ? 00:00:00 [kworker/2:1]
root 975 2 0 Aug25 ? 00:00:00 [khungtaskd]
root 981 2 0 Aug25 ? 00:00:00 [kswapd0]
root 982 2 0 Aug25 ? 00:00:00 [khugepaged]
root 983 2 0 Aug25 ? 00:00:00 [fsnotify_mark]
root 984 2 0 Aug25 ? 00:00:00 [nfsiod]
root 986 2 0 Aug25 ? 00:00:00 [crypto]
root 1151 2 0 Aug25 ? 00:00:00 [kpsmoused]
root 1167 2 0 Aug25 ? 00:00:00 [deferwq]
root 1913 2 0 Aug25 ? 00:00:00 [bond0]
root 2144 1 0 Aug25 ? 00:00:00 udevd --daemon
root 2525 2 0 Aug25 ? 00:00:00 [scsi_eh_0]
root 2537 2 0 Aug25 ? 00:00:00 [mlx4]
root 2540 2 0 Aug25 ? 00:00:00 [scsi_eh_1]
root 2541 2 0 Aug25 ? 00:00:00 [fw_event0]
root 2595 2 0 Aug25 ? 00:00:00 [scsi_eh_2]
root 2596 2 0 Aug25 ? 00:00:00 [scsi_eh_3]
root 2597 2 0 Aug25 ? 00:00:00 [scsi_eh_4]
root 2598 2 0 Aug25 ? 00:00:00 [scsi_eh_5]
root 2599 2 0 Aug25 ? 00:00:00 [scsi_eh_6]
root 2600 2 0 Aug25 ? 00:00:00 [scsi_eh_7]
root 2667 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2797 2144 0 Aug25 ? 00:00:00 udevd --daemon
root 2798 2144 0 Aug25 ? 00:00:00 udevd --daemon
root 2800 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2817 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2843 2 0 Aug25 ? 00:00:00 [poll_0_status]
root 2844 2 0 Aug25 ? 00:00:00 [scsi_eh_8]
root 2846 2 0 Aug25 ? 00:00:00 [fw_event1]
root 2860 2 0 Aug25 ? 00:00:00 [poll_1_status]
root 2861 2 0 Aug25 ? 00:00:00 [scsi_eh_9]
root 2862 2 0 Aug25 ? 00:00:00 [fw_event2]
root 2864 2 0 Aug25 ? 00:00:00 [poll_2_status]
root 2865 2 0 Aug25 ? 00:00:00 [scsi_eh_10]
root 2866 2 0 Aug25 ? 00:00:00 [fw_event3]
root 2867 2 0 Aug25 ? 00:00:00 [poll_3_status]
root 4059 2 0 Aug25 ? 00:00:00 [bond1]
root 4464 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4470 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4476 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4482 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4488 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4494 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4500 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4506 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4512 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4518 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4527 2 0 Aug25 ? 00:00:00 [jbd2/dm-4-8]
root 4528 2 0 Aug25 ? 00:00:00 [ext4-dio-unwrit]
root 4593 1 0 Aug25 ? 00:00:00 /usr/sbin/rsyslogd -c5
root 4627 1 0 Aug25 ? 00:00:00 /usr/sbin/sshd
root 4645 1 0 Aug25 ? 00:00:00 /usr/sbin/acpid
root 4683 1 0 Aug25 ? 00:00:00 /usr/sbin/cron
ntp 4713 1 0 Aug25 ? 00:00:00 /usr/sbin/ntpd -g -c
/etc/ntp.conf -p /var/run/ntpd.pid -u ntp:ntp
daemon 4747 1 0 Aug25 ? 00:00:00 /sbin/portmap
root 4773 1 0 Aug25 ? 00:00:00 /sbin/rpc.statd
root 4796 1 0 Aug25 ttyS0 00:00:00 /sbin/getty -L 115200
ttyS0 vt100
root 4797 1 0 Aug25 ? 00:00:22 initBC

root 4908 2 0 Aug25 ? 00:00:00 [xfsalloc]
root 4909 2 0 Aug25 ? 00:00:00 [xfs_mru_cache]
root 4910 2 0 Aug25 ? 00:00:00 [xfslogd]
root 4930 2 0 Aug25 ? 00:00:00 [xfsbufd/sdc]
root 4931 2 0 Aug25 ? 00:00:00 [xfs-data/sdc]
root 4932 2 0 Aug25 ? 00:00:00 [xfs-conv/sdc]
root 4933 2 0 Aug25 ? 00:00:01 [xfsaild/sdc]
root 4935 2 0 Aug25 ? 00:00:00 [xfsbufd/sdd]
root 4936 2 0 Aug25 ? 00:00:00 [xfs-data/sdd]
root 4937 2 0 Aug25 ? 00:00:00 [xfs-conv/sdd]
root 4938 2 0 Aug25 ? 00:00:01 [xfsaild/sdd]
root 5115 1 6 Aug25 ? 00:11:28 [streamRT-sa] <defunct>
root 6582 2 0 Aug25 ? 00:00:00 [kworker/1:2]
root 6633 2 0 Aug25 ? 00:00:00 [kworker/23:1]
root 6636 2 0 Aug25 ? 00:00:00 [kworker/15:0]
root 6640 2 0 Aug25 ? 00:00:00 [kworker/29:2]
root 6647 2 0 Aug25 ? 00:00:00 [kworker/3:1]
root 6649 2 0 Aug25 ? 00:00:00 [kworker/25:2]
root 6654 2 0 Aug25 ? 00:00:00 [kworker/1:0]
root 6658 2 0 Aug25 ? 00:00:00 [kworker/19:1]
root 6659 2 0 Aug25 ? 00:00:00 [kworker/21:1]
root 6661 2 0 Aug25 ? 00:00:00 [kworker/27:2]
root 6662 2 0 Aug25 ? 00:00:00 [kworker/15:2]
root 6663 2 0 Aug25 ? 00:00:00 [kworker/5:2]
root 6666 2 0 Aug25 ? 00:00:00 [kworker/31:2]
root 6671 2 0 Aug25 ? 00:00:00 [kworker/0:4]
root 6673 2 0 Aug25 ? 00:00:00 [kworker/7:3]
root 6677 2 0 Aug25 ? 00:00:00 [kworker/0:6]
root 6678 2 0 Aug25 ? 00:00:00 [kworker/0:7]
root 6681 2 0 Aug25 ? 00:00:00 [kworker/0:10]
root 6684 2 0 Aug25 ? 00:00:00 [kworker/13:5]
root 6690 2 0 Aug25 ? 00:00:00 [kworker/9:4]
root 6695 2 0 Aug25 ? 00:00:00 [kworker/7:5]
root 6696 2 0 Aug25 ? 00:00:00 [kworker/13:7]
root 6697 2 0 Aug25 ? 00:00:00 [kworker/9:6]
root 6702 2 0 Aug25 ? 00:00:00 [kworker/0:11]
root 6705 2 0 Aug25 ? 00:00:00 [kworker/0:14]
root 6707 2 0 Aug25 ? 00:00:00 [kworker/0:16]
root 6719 2 0 Aug25 ? 00:00:00 [kworker/0:24]
root 6721 2 0 Aug25 ? 00:00:00 [kworker/0:26]
root 6723 2 0 Aug25 ? 00:00:00 [kworker/9:8]
root 6726 2 0 Aug25 ? 00:00:00 [kworker/31:5]
root 6731 2 0 Aug25 ? 00:00:00 [kworker/1:7]
root 6734 2 0 Aug25 ? 00:00:00 [kworker/11:6]
root 6736 2 0 Aug25 ? 00:00:00 [kworker/13:10]
root 6740 2 0 Aug25 ? 00:00:00 [kworker/0:30]
root 6741 2 0 Aug25 ? 00:00:00 [kworker/0:31]
root 6745 2 0 Aug25 ? 00:00:00 [kworker/9:9]
root 6748 2 0 Aug25 ? 00:00:00 [kworker/0:36]
root 6749 2 0 Aug25 ? 00:00:00 [kworker/19:3]
root 6753 2 0 Aug25 ? 00:00:00 [kworker/23:3]
root 6757 2 0 Aug25 ? 00:00:00 [kworker/13:12]
root 6758 2 0 Aug25 ? 00:00:00 [kworker/7:6]
root 6761 2 0 Aug25 ? 00:00:00 [kworker/0:41]
root 6767 2 0 Aug25 ? 00:00:00 [kworker/0:44]
root 6768 2 0 Aug25 ? 00:00:00 [kworker/5:7]
root 6769 2 0 Aug25 ? 00:00:00 [kworker/1:9]
root 6770 2 0 Aug25 ? 00:00:00 [kworker/0:45]
root 6774 2 0 Aug25 ? 00:00:00 [kworker/17:7]
root 6776 2 0 Aug25 ? 00:00:00 [kworker/3:4]
root 6779 2 0 Aug25 ? 00:00:00 [kworker/11:9]
root 6781 2 0 Aug25 ? 00:00:00 [kworker/0:50]
root 6787 2 0 Aug25 ? 00:00:00 [kworker/0:53]
root 6788 2 0 Aug25 ? 00:00:00 [kworker/25:1]
root 6793 2 0 Aug25 ? 00:00:00 [kworker/1:10]
root 6795 2 0 Aug25 ? 00:00:00 [kworker/27:4]
root 6797 2 0 Aug25 ? 00:00:00 [kworker/0:57]
root 6798 2 0 Aug25 ? 00:00:00 [kworker/0:58]
root 6806 2 0 Aug25 ? 00:00:00 [kworker/15:5]
root 6807 2 0 Aug25 ? 00:00:00 [kworker/0:62]
root 6808 2 0 Aug25 ? 00:00:00 [kworker/0:63]
root 6809 2 0 Aug25 ? 00:00:00 [kworker/17:8]
root 6814 2 0 Aug25 ? 00:00:00 [kworker/0:66]
root 6815 2 0 Aug25 ? 00:00:01 [kworker/0:67]
root 6819 2 0 Aug25 ? 00:00:00 [kworker/0:69]
root 6824 2 0 Aug25 ? 00:00:00 [kworker/25:3]
root 6895 2 0 00:20 ? 00:00:00 [kworker/u:0]
root 6900 2 0 00:26 ? 00:00:00 [kworker/u:1]
root 6904 2 0 00:57 ? 00:00:00 [flush-0:20]
root 6933 2 0 01:00 ? 00:00:00 [kworker/u:2]
root 6934 4627 0 01:02 ? 00:00:00 sshd: iris [priv]
iris 6936 6934 0 01:02 ? 00:00:00 sshd: ***@pts/1
iris 6937 6936 0 01:02 pts/1 00:00:00 -bash
root 6967 6937 0 01:02 pts/1 00:00:00 sudo su
root 6968 6967 0 01:02 pts/1 00:00:00 su
root 6969 6968 0 01:02 pts/1 00:00:00 bash
root 6976 6969 0 01:04 pts/1 00:00:00 ps -ef

# lsof|grep stream
lsof: WARNING: can't stat() xfs file system /mnt/VOL1
Output information may be incomplete.
bash 6937 iris cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
sudo 6967 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
su 6968 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
bash 6969 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
--
Stan
Stan Hoeppner
2014-08-26 06:25:59 UTC
Forgot to mention:
load average: 171.00, 171.00, 170.95

And that's with all processes idle for over 30 minutes; all those D state
tasks count toward the load average even though nothing is running. So
I'm thinking bug. It has probably been fixed between 3.4.26 and 3.4.103,
but they're stubborn about sticking with this 3.4.26 kernel, not because
they have an affinity for it, but because of their slow process.
Post by Stan Hoeppner
Had some controller issues but believe we had those ironed out before this
recent breakage. I had reformatted both 48TB LUNs on this test box with -f
and defaults, and fired up the test app again. Throughput was fantastic
with no dropped application buffers for ~30 minutes, and IO times were 1.x
ms max, suggesting all the writes were acked by controller cache. The app
Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005
I have many call traces in dmesg, most are XFS, pasted below. I found no
SCSI I/O errors in dmesg. Application is submitting writes via libaio and
using O_DIRECT, writing to preallocated files. I don't see any errors in
the storage controller log and everything seems to check out there.
Using
Post by Stan Hoeppner
noop elevator, mpt2sas, no multipathing. The application process is hung
in d state and kill -9 won't get rid of it. I can't unmount the hosed up
filesystem. Any ideas?
# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error
# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error
# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s
# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1
# umount /mnt/VOL1
umount: /mnt/VOL1: device is busy.
(In some cases useful info about processes that use
the device is found by lsof(8) or fuser(1))
Kernel 3.4.26
xfs_repair version 3.1.4
MemTotal: 264573184 kB
MemFree: 246519624 kB
Buffers: 16820 kB
Cached: 78120 kB
SwapCached: 0 kB
Active: 13130040 kB
Inactive: 75352 kB
Active(anon): 13110512 kB
Inactive(anon): 276 kB
Active(file): 19528 kB
Inactive(file): 75076 kB
Unevictable: 0 kB
Mlocked: 0 kB
SwapTotal: 0 kB
SwapFree: 0 kB
Dirty: 0 kB
Writeback: 0 kB
AnonPages: 13126968 kB
Mapped: 11848 kB
Shmem: 340 kB
Slab: 208476 kB
SReclaimable: 118240 kB
SUnreclaim: 90236 kB
KernelStack: 3976 kB
PageTables: 32840 kB
NFS_Unstable: 0 kB
Bounce: 0 kB
WritebackTmp: 0 kB
CommitLimit: 132286592 kB
Committed_AS: 25044716 kB
VmallocTotal: 34359738367 kB
VmallocUsed: 1050608 kB
VmallocChunk: 34358680400 kB
AnonHugePages: 13078528 kB
HugePages_Total: 0
HugePages_Free: 0
HugePages_Rsvd: 0
HugePages_Surp: 0
Hugepagesize: 2048 kB
DirectMap4k: 5056 kB
DirectMap2M: 2045952 kB
DirectMap1G: 266338304 kB
/dev/sdc /mnt/VOL0 xfs
rw,noatime,nodiratime,attr2,nobarrier,inode64,noquota 0 0
/dev/sdd /mnt/VOL1 xfs
rw,noatime,nodiratime,attr2,nobarrier,inode64,noquota 0 0
major minor #blocks name
8 32 46837141504 sdc
8 48 46837141504 sdd
2x hardware RAID5 LUNs - 64KB su, 768KB sw
2 controllers, one per LUN, 3GB cache each, write back, FPGA RAID engine
No LVM. LUNs directly formatted with XFS
26 Seagate SAS 3.5" 7.2K drives, 13 per RAID5, drive caches disabled
My counterpart had what I'd guess is this same problem on the full test
rig,
which has 16 of these LUNS, 208 drives total. He was also testing with
libaio.
I have not dug into the logs on that host as of yet. Its config is
identical
to this box but for the number of LUNs, drives.
[22635.102013] INFO: task kworker/7:0:45 blocked for more than 120
seconds.
[22635.102016] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102018] kworker/7:0 D ffff8840666c0b08 0 45 2
0x00000000
[22635.102021] ffff8840666e7bd0 0000000000000046 ffff883f7c02d000
ffff8840666f5180
[22635.102026] ffff8840666e7b80 0000000000000206 00000000000122c0
00000000000122c0
[22635.102030] ffff8840666e7fd8 ffff8840666e6000 00000000000122c0
ffff8840666e6000
[22635.102041] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102044] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102047] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102051] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102055] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102070] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102077] [<ffffffffa01b5ab0>]
xfs_iomap_write_unwritten+0x1b3/0x32e
Post by Stan Hoeppner
[xfs]
[22635.102080] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102084] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102088] [<ffffffff810bc602>] ? mempool_free+0x73/0x78
[22635.102093] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102098] [<ffffffffa01ab45b>] xfs_end_io+0x89/0xb4 [xfs]
[22635.102102] [<ffffffff81047321>] process_one_work+0x204/0x327
[22635.102105] [<ffffffff8104757f>] worker_thread+0x13b/0x25a
[22635.102108] [<ffffffff81047444>] ? process_one_work+0x327/0x327
[22635.102111] [<ffffffff8104af6f>] kthread+0x89/0x91
[22635.102115] [<ffffffff814fdbd4>] kernel_thread_helper+0x4/0x10
[22635.102118] [<ffffffff8104aee6>] ? __init_kthread_worker+0x3c/0x3c
[22635.102120] [<ffffffff814fdbd0>] ? gs_change+0xb/0xb
[22635.102131] INFO: task streamRT-sa:5891 blocked for more than 120
seconds.
[22635.102132] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102134] streamRT-sa D ffff883f52850348 0 5891 5114
0x00000004
[22635.102136] ffff884063af5b28 0000000000000082 000000000000029c
ffff884066679100
[22635.102140] 0000000000000006 0000000000000001 00000000000122c0
00000000000122c0
[22635.102143] ffff884063af5fd8 ffff884063af4000 00000000000122c0
ffff884063af4000
[22635.102149] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102151] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102154] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102156] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102158] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102164] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102170] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102172] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102178] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102184] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102189] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102195] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102199] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102202] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102204] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102207] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102209] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102212] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102215] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102217] INFO: task streamRT-sa:5895 blocked for more than 120
seconds.
[22635.102218] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102219] streamRT-sa D ffff883f537fa3c8 0 5895 5114
0x00000004
[22635.102222] ffff883f7d057b28 0000000000000082 0000000000000000
ffff884066227100
[22635.102225] ffff883f7d057ad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102228] ffff883f7d057fd8 ffff883f7d056000 00000000000122c0
ffff883f7d056000
[22635.102234] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102236] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102239] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102241] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102243] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102249] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102254] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102257] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102262] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102268] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102274] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102279] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102282] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102284] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102286] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102288] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102291] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102294] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102297] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102299] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102301] INFO: task streamRT-sa:5900 blocked for more than 120
seconds.
[22635.102302] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102303] streamRT-sa D ffff883f53003b88 0 5900 5114
0x00000004
[22635.102305] ffff883f7d109b28 0000000000000082 0000000000000000
ffff884066679100
[22635.102309] ffff883f7d109ad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102312] ffff883f7d109fd8 ffff883f7d108000 00000000000122c0
ffff883f7d108000
[22635.102318] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102320] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102322] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102324] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102327] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102332] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102338] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102340] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102346] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102351] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102357] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102362] [<ffffffff8105539b>] ? resched_task+0x3e/0x75
[22635.102367] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102369] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102372] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102374] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102376] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102379] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102382] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102384] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102386] INFO: task streamRT-sa:5904 blocked for more than 120
seconds.
[22635.102387] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102388] streamRT-sa D ffff88405fd50a48 0 5904 5114
0x00000004
[22635.102390] ffff883f51c2bb28 0000000000000082 0000000000000000
ffff8840662d0080
[22635.102394] ffff883f51c2bad8 0000000000000002 00000000000122c0
00000000000122c0
[22635.102397] ffff883f51c2bfd8 ffff883f51c2a000 00000000000122c0
ffff883f51c2a000
[22635.102402] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102405] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102407] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102409] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102412] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102417] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102423] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102425] [<ffffffff814f6a92>] ?
_raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102430] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102436] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102442] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102447] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102450] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102452] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102454] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102457] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102459] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102461] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102464] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102466] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102468] INFO: task streamRT-sa:5906 blocked for more than 120
seconds.
[22635.102469] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102470] streamRT-sa D ffff88405fe2ca88 0 5906 5114
0x00000004
[22635.102473] ffff883f51c2fb28 0000000000000082 0000000000000000
ffff8840667f00c0
[22635.102476] 0000000000000000 0000000000000000 00000000000122c0
00000000000122c0
[22635.102479] ffff883f51c2ffd8 ffff883f51c2e000 00000000000122c0
ffff883f51c2e000
[22635.102485] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102487] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102489] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102491] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102494] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102499] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102505] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102507] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102512] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102518] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102524] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102529] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102532] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102534] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102536] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102538] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102541] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102543] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102545] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102548] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102550] INFO: task streamRT-sa:5908 blocked for more than 120
seconds.
[22635.102551] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102552] streamRT-sa D ffff88405fe2fac8 0 5908 5114
0x00000004
[22635.102554] ffff883f518b3b28 0000000000000082 000000000000029c
ffff884066770040
[22635.102558] 0000000000000006 0000000000000001 00000000000122c0
00000000000122c0
[22635.102561] ffff883f518b3fd8 ffff883f518b2000 00000000000122c0
ffff883f518b2000
[22635.102566] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102569] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102571] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102573] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102575] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102581] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102586] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102589] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102594] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102600] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102620] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102627] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102630] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102637] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102641] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102651] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102655] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102661] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102664] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102669] INFO: task streamRT-sa:5909 blocked for more than 120
seconds.
[22635.102675] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102677] streamRT-sa D ffff88405fe2f3c8 0 5909 5114
0x00000004
[22635.102683] ffff883f518b5b28 0000000000000082 00000000000000de
ffff88406663a040
[22635.102694] 0000000000000002 0000000000000001 00000000000122c0
00000000000122c0
[22635.102705] ffff883f518b5fd8 ffff883f518b4000 00000000000122c0
ffff883f518b4000
[22635.102723] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102729] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102734] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102739] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102742] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102751] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102761] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102765] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102773] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102781] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102788] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102798] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102801] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102805] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102809] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102812] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102815] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102818] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102824] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102827] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102834] INFO: task streamRT-sa:5911 blocked for more than 120
seconds.
[22635.102836] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables
this message.
[22635.102838] streamRT-sa D ffff883f5372c408 0 5911 5114
0x00000004
[22635.102844] ffff883f5208bb28 0000000000000082 0000000000000000
ffff884066001180
[22635.102850] 0000000000000000 0000000000000000 00000000000122c0
00000000000122c0
[22635.102861] ffff883f5208bfd8 ffff883f5208a000 00000000000122c0
ffff883f5208a000
[22635.102883] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102886] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102890] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102895] [<ffffffff81261913>]
call_rwsem_down_write_failed+0x13/0x20
[22635.102902] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102911] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102918] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.102924] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.102931] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe
[xfs]
[22635.102939] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc
[xfs]
[22635.102947] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.102954] [<ffffffffa01b0971>] ?
xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.102959] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.102962] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.102968] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.102973] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.102976] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.102980] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.102985] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.102989] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
[22635.102995] INFO: task streamRT-sa:5918 blocked for more than 120 seconds.
[22635.102997] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[22635.103001] streamRT-sa D ffff883f53733a48 0 5918 5114 0x00000004
[22635.103007] ffff884063d99b28 0000000000000082 0000000000000000 ffff88406668b1c0
[22635.103017] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
[22635.103029] ffff884063d99fd8 ffff884063d98000 00000000000122c0 ffff884063d98000
[22635.103042] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.103047] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.103050] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.103054] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
[22635.103058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.103065] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.103072] [<ffffffffa01b00e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[22635.103076] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[22635.103086] [<ffffffffa01b0519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
[22635.103092] [<ffffffffa01b06ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
[22635.103099] [<ffffffffa01b0ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[22635.103109] [<ffffffffa01b0971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[22635.103113] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[22635.103120] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[22635.103126] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[22635.103129] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[22635.103134] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
[22635.103139] [<ffffffff8103ed3f>] ? sys_rt_sigprocmask+0x69/0xa1
[22635.103142] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
[22635.103146] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
# ps -ef
UID PID PPID C STIME TTY TIME CMD
root 1 0 0 Aug25 ? 00:00:05 init [3]
root 2 0 0 Aug25 ? 00:00:00 [kthreadd]
root 3 2 0 Aug25 ? 00:00:00 [ksoftirqd/0]
root 6 2 0 Aug25 ? 00:00:00 [migration/0]
root 7 2 0 Aug25 ? 00:00:00 [rcuc/0]
root 8 2 0 Aug25 ? 00:00:00 [rcun/0]
root 9 2 0 Aug25 ? 00:00:00 [rcub/0]
root 10 2 0 Aug25 ? 00:00:00 [rcun/1]
root 11 2 0 Aug25 ? 00:00:00 [rcub/1]
root 12 2 0 Aug25 ? 00:00:00 [watchdog/0]
root 13 2 0 Aug25 ? 00:00:00 [migration/1]
root 15 2 0 Aug25 ? 00:00:00 [rcuc/1]
root 16 2 0 Aug25 ? 00:00:00 [ksoftirqd/1]
root 18 2 0 Aug25 ? 00:00:00 [watchdog/1]
root 19 2 0 Aug25 ? 00:00:00 [migration/2]
root 20 2 0 Aug25 ? 00:00:00 [kworker/2:0]
root 21 2 0 Aug25 ? 00:00:00 [rcuc/2]
root 22 2 0 Aug25 ? 00:00:00 [ksoftirqd/2]
root 23 2 0 Aug25 ? 00:00:00 [watchdog/2]
root 24 2 0 Aug25 ? 00:00:00 [migration/3]
root 26 2 0 Aug25 ? 00:00:00 [rcuc/3]
root 27 2 0 Aug25 ? 00:00:00 [ksoftirqd/3]
root 28 2 0 Aug25 ? 00:00:00 [watchdog/3]
root 29 2 0 Aug25 ? 00:00:00 [migration/4]
root 30 2 0 Aug25 ? 00:00:00 [kworker/4:0]
root 31 2 0 Aug25 ? 00:00:00 [rcuc/4]
root 32 2 0 Aug25 ? 00:00:00 [ksoftirqd/4]
root 33 2 0 Aug25 ? 00:00:00 [watchdog/4]
root 34 2 0 Aug25 ? 00:00:00 [migration/5]
root 36 2 0 Aug25 ? 00:00:00 [rcuc/5]
root 37 2 0 Aug25 ? 00:00:00 [ksoftirqd/5]
root 38 2 0 Aug25 ? 00:00:00 [watchdog/5]
root 39 2 0 Aug25 ? 00:00:00 [migration/6]
root 40 2 0 Aug25 ? 00:00:00 [kworker/6:0]
root 41 2 0 Aug25 ? 00:00:00 [rcuc/6]
root 42 2 0 Aug25 ? 00:00:00 [ksoftirqd/6]
root 43 2 0 Aug25 ? 00:00:00 [watchdog/6]
root 44 2 0 Aug25 ? 00:00:00 [migration/7]
root 45 2 0 Aug25 ? 00:00:00 [kworker/7:0]
root 46 2 0 Aug25 ? 00:00:00 [rcuc/7]
root 47 2 0 Aug25 ? 00:00:00 [ksoftirqd/7]
root 48 2 0 Aug25 ? 00:00:00 [watchdog/7]
root 49 2 0 Aug25 ? 00:00:00 [migration/8]
root 50 2 0 Aug25 ? 00:00:00 [kworker/8:0]
root 51 2 0 Aug25 ? 00:00:00 [rcuc/8]
root 52 2 0 Aug25 ? 00:00:00 [ksoftirqd/8]
root 53 2 0 Aug25 ? 00:00:00 [watchdog/8]
root 54 2 0 Aug25 ? 00:00:00 [migration/9]
root 56 2 0 Aug25 ? 00:00:00 [rcuc/9]
root 57 2 0 Aug25 ? 00:00:00 [ksoftirqd/9]
root 58 2 0 Aug25 ? 00:00:00 [watchdog/9]
root 59 2 0 Aug25 ? 00:00:00 [migration/10]
root 60 2 0 Aug25 ? 00:00:00 [kworker/10:0]
root 61 2 0 Aug25 ? 00:00:00 [rcuc/10]
root 62 2 0 Aug25 ? 00:00:00 [ksoftirqd/10]
root 63 2 0 Aug25 ? 00:00:00 [watchdog/10]
root 64 2 0 Aug25 ? 00:00:00 [migration/11]
root 66 2 0 Aug25 ? 00:00:00 [rcuc/11]
root 67 2 0 Aug25 ? 00:00:00 [ksoftirqd/11]
root 68 2 0 Aug25 ? 00:00:00 [watchdog/11]
root 69 2 0 Aug25 ? 00:00:00 [migration/12]
root 70 2 0 Aug25 ? 00:00:00 [kworker/12:0]
root 71 2 0 Aug25 ? 00:00:00 [rcuc/12]
root 72 2 0 Aug25 ? 00:00:00 [ksoftirqd/12]
root 73 2 0 Aug25 ? 00:00:00 [watchdog/12]
root 74 2 0 Aug25 ? 00:00:00 [migration/13]
root 76 2 0 Aug25 ? 00:00:00 [rcuc/13]
root 77 2 0 Aug25 ? 00:00:00 [ksoftirqd/13]
root 78 2 0 Aug25 ? 00:00:00 [watchdog/13]
root 79 2 0 Aug25 ? 00:00:00 [migration/14]
root 80 2 0 Aug25 ? 00:00:00 [kworker/14:0]
root 81 2 0 Aug25 ? 00:00:00 [rcuc/14]
root 82 2 0 Aug25 ? 00:00:00 [ksoftirqd/14]
root 83 2 0 Aug25 ? 00:00:00 [watchdog/14]
root 84 2 0 Aug25 ? 00:00:00 [migration/15]
root 86 2 0 Aug25 ? 00:00:00 [rcuc/15]
root 87 2 0 Aug25 ? 00:00:00 [ksoftirqd/15]
root 88 2 0 Aug25 ? 00:00:00 [watchdog/15]
root 89 2 0 Aug25 ? 00:00:00 [migration/16]
root 90 2 0 Aug25 ? 00:00:00 [kworker/16:0]
root 91 2 0 Aug25 ? 00:00:00 [rcuc/16]
root 92 2 0 Aug25 ? 00:00:00 [rcun/2]
root 93 2 0 Aug25 ? 00:00:00 [rcub/2]
root 94 2 0 Aug25 ? 00:00:00 [ksoftirqd/16]
root 95 2 0 Aug25 ? 00:00:00 [watchdog/16]
root 96 2 0 Aug25 ? 00:00:00 [migration/17]
root 98 2 0 Aug25 ? 00:00:00 [rcuc/17]
root 99 2 0 Aug25 ? 00:00:00 [ksoftirqd/17]
root 100 2 0 Aug25 ? 00:00:00 [watchdog/17]
root 101 2 0 Aug25 ? 00:00:00 [migration/18]
root 102 2 0 Aug25 ? 00:00:00 [kworker/18:0]
root 103 2 0 Aug25 ? 00:00:00 [rcuc/18]
root 104 2 0 Aug25 ? 00:00:00 [ksoftirqd/18]
root 105 2 0 Aug25 ? 00:00:00 [watchdog/18]
root 106 2 0 Aug25 ? 00:00:00 [migration/19]
root 107 2 0 Aug25 ? 00:00:00 [kworker/19:0]
root 108 2 0 Aug25 ? 00:00:00 [rcuc/19]
root 109 2 0 Aug25 ? 00:00:00 [ksoftirqd/19]
root 110 2 0 Aug25 ? 00:00:00 [watchdog/19]
root 111 2 0 Aug25 ? 00:00:00 [migration/20]
root 112 2 0 Aug25 ? 00:00:00 [kworker/20:0]
root 113 2 0 Aug25 ? 00:00:00 [rcuc/20]
root 114 2 0 Aug25 ? 00:00:00 [ksoftirqd/20]
root 115 2 0 Aug25 ? 00:00:00 [watchdog/20]
root 116 2 0 Aug25 ? 00:00:00 [migration/21]
root 117 2 0 Aug25 ? 00:00:00 [kworker/21:0]
root 118 2 0 Aug25 ? 00:00:00 [rcuc/21]
root 119 2 0 Aug25 ? 00:00:00 [ksoftirqd/21]
root 120 2 0 Aug25 ? 00:00:00 [watchdog/21]
root 121 2 0 Aug25 ? 00:00:00 [migration/22]
root 122 2 0 Aug25 ? 00:00:00 [kworker/22:0]
root 123 2 0 Aug25 ? 00:00:00 [rcuc/22]
root 124 2 0 Aug25 ? 00:00:00 [ksoftirqd/22]
root 125 2 0 Aug25 ? 00:00:00 [watchdog/22]
root 126 2 0 Aug25 ? 00:00:00 [migration/23]
root 128 2 0 Aug25 ? 00:00:00 [rcuc/23]
root 129 2 0 Aug25 ? 00:00:00 [ksoftirqd/23]
root 130 2 0 Aug25 ? 00:00:00 [watchdog/23]
root 131 2 0 Aug25 ? 00:00:00 [migration/24]
root 132 2 0 Aug25 ? 00:00:00 [kworker/24:0]
root 133 2 0 Aug25 ? 00:00:00 [rcuc/24]
root 134 2 0 Aug25 ? 00:00:00 [ksoftirqd/24]
root 135 2 0 Aug25 ? 00:00:00 [watchdog/24]
root 136 2 0 Aug25 ? 00:00:00 [migration/25]
root 138 2 0 Aug25 ? 00:00:00 [rcuc/25]
root 139 2 0 Aug25 ? 00:00:00 [ksoftirqd/25]
root 140 2 0 Aug25 ? 00:00:00 [watchdog/25]
root 141 2 0 Aug25 ? 00:00:00 [migration/26]
root 142 2 0 Aug25 ? 00:00:00 [kworker/26:0]
root 143 2 0 Aug25 ? 00:00:00 [rcuc/26]
root 144 2 0 Aug25 ? 00:00:00 [ksoftirqd/26]
root 145 2 0 Aug25 ? 00:00:00 [watchdog/26]
root 146 2 0 Aug25 ? 00:00:00 [migration/27]
root 147 2 0 Aug25 ? 00:00:00 [kworker/27:0]
root 148 2 0 Aug25 ? 00:00:00 [rcuc/27]
root 149 2 0 Aug25 ? 00:00:00 [ksoftirqd/27]
root 150 2 0 Aug25 ? 00:00:00 [watchdog/27]
root 151 2 0 Aug25 ? 00:00:00 [migration/28]
root 152 2 0 Aug25 ? 00:00:00 [kworker/28:0]
root 153 2 0 Aug25 ? 00:00:00 [rcuc/28]
root 154 2 0 Aug25 ? 00:00:00 [ksoftirqd/28]
root 155 2 0 Aug25 ? 00:00:00 [watchdog/28]
root 156 2 0 Aug25 ? 00:00:00 [migration/29]
root 157 2 0 Aug25 ? 00:00:00 [kworker/29:0]
root 158 2 0 Aug25 ? 00:00:00 [rcuc/29]
root 159 2 0 Aug25 ? 00:00:00 [ksoftirqd/29]
root 160 2 0 Aug25 ? 00:00:00 [watchdog/29]
root 161 2 0 Aug25 ? 00:00:00 [migration/30]
root 162 2 0 Aug25 ? 00:00:00 [kworker/30:0]
root 163 2 0 Aug25 ? 00:00:00 [rcuc/30]
root 164 2 0 Aug25 ? 00:00:00 [ksoftirqd/30]
root 165 2 0 Aug25 ? 00:00:00 [watchdog/30]
root 166 2 0 Aug25 ? 00:00:00 [migration/31]
root 167 2 0 Aug25 ? 00:00:00 [kworker/31:0]
root 168 2 0 Aug25 ? 00:00:00 [rcuc/31]
root 169 2 0 Aug25 ? 00:00:00 [ksoftirqd/31]
root 170 2 0 Aug25 ? 00:00:00 [watchdog/31]
root 171 2 0 Aug25 ? 00:00:00 [cpuset]
root 172 2 0 Aug25 ? 00:00:00 [khelper]
root 173 2 0 Aug25 ? 00:00:00 [netns]
root 418 2 0 Aug25 ? 00:00:00 [sync_supers]
root 420 2 0 Aug25 ? 00:00:00 [bdi-default]
root 422 2 0 Aug25 ? 00:00:00 [kblockd]
root 714 2 0 Aug25 ? 00:00:00 [ata_sff]
root 724 2 0 Aug25 ? 00:00:00 [khubd]
root 733 2 0 Aug25 ? 00:00:00 [kworker/30:1]
root 735 2 0 Aug25 ? 00:00:00 [kworker/28:1]
root 737 2 0 Aug25 ? 00:00:00 [kworker/26:1]
root 739 2 0 Aug25 ? 00:00:00 [kworker/24:1]
root 741 2 0 Aug25 ? 00:00:00 [kworker/22:1]
root 743 2 0 Aug25 ? 00:00:00 [kworker/20:1]
root 745 2 0 Aug25 ? 00:00:00 [kworker/18:1]
root 747 2 0 Aug25 ? 00:00:00 [kworker/16:1]
root 749 2 0 Aug25 ? 00:00:00 [kworker/14:1]
root 751 2 0 Aug25 ? 00:00:00 [kworker/12:1]
root 757 2 0 Aug25 ? 00:00:00 [kworker/10:1]
root 759 2 0 Aug25 ? 00:00:00 [kworker/8:1]
root 761 2 0 Aug25 ? 00:00:00 [kworker/6:1]
root 763 2 0 Aug25 ? 00:00:00 [kworker/4:1]
root 866 2 0 Aug25 ? 00:00:00 [rpciod]
root 868 2 0 Aug25 ? 00:00:00 [kworker/2:1]
root 975 2 0 Aug25 ? 00:00:00 [khungtaskd]
root 981 2 0 Aug25 ? 00:00:00 [kswapd0]
root 982 2 0 Aug25 ? 00:00:00 [khugepaged]
root 983 2 0 Aug25 ? 00:00:00 [fsnotify_mark]
root 984 2 0 Aug25 ? 00:00:00 [nfsiod]
root 986 2 0 Aug25 ? 00:00:00 [crypto]
root 1151 2 0 Aug25 ? 00:00:00 [kpsmoused]
root 1167 2 0 Aug25 ? 00:00:00 [deferwq]
root 1913 2 0 Aug25 ? 00:00:00 [bond0]
root 2144 1 0 Aug25 ? 00:00:00 udevd --daemon
root 2525 2 0 Aug25 ? 00:00:00 [scsi_eh_0]
root 2537 2 0 Aug25 ? 00:00:00 [mlx4]
root 2540 2 0 Aug25 ? 00:00:00 [scsi_eh_1]
root 2541 2 0 Aug25 ? 00:00:00 [fw_event0]
root 2595 2 0 Aug25 ? 00:00:00 [scsi_eh_2]
root 2596 2 0 Aug25 ? 00:00:00 [scsi_eh_3]
root 2597 2 0 Aug25 ? 00:00:00 [scsi_eh_4]
root 2598 2 0 Aug25 ? 00:00:00 [scsi_eh_5]
root 2599 2 0 Aug25 ? 00:00:00 [scsi_eh_6]
root 2600 2 0 Aug25 ? 00:00:00 [scsi_eh_7]
root 2667 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2797 2144 0 Aug25 ? 00:00:00 udevd --daemon
root 2798 2144 0 Aug25 ? 00:00:00 udevd --daemon
root 2800 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2817 2 0 Aug25 ? 00:00:00 [mlx4_en]
root 2843 2 0 Aug25 ? 00:00:00 [poll_0_status]
root 2844 2 0 Aug25 ? 00:00:00 [scsi_eh_8]
root 2846 2 0 Aug25 ? 00:00:00 [fw_event1]
root 2860 2 0 Aug25 ? 00:00:00 [poll_1_status]
root 2861 2 0 Aug25 ? 00:00:00 [scsi_eh_9]
root 2862 2 0 Aug25 ? 00:00:00 [fw_event2]
root 2864 2 0 Aug25 ? 00:00:00 [poll_2_status]
root 2865 2 0 Aug25 ? 00:00:00 [scsi_eh_10]
root 2866 2 0 Aug25 ? 00:00:00 [fw_event3]
root 2867 2 0 Aug25 ? 00:00:00 [poll_3_status]
root 4059 2 0 Aug25 ? 00:00:00 [bond1]
root 4464 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4470 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4476 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4482 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4488 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4494 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4500 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4506 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4512 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4518 2 0 Aug25 ? 00:00:00 [kdmflush]
root 4527 2 0 Aug25 ? 00:00:00 [jbd2/dm-4-8]
root 4528 2 0 Aug25 ? 00:00:00 [ext4-dio-unwrit]
root 4593 1 0 Aug25 ? 00:00:00 /usr/sbin/rsyslogd -c5
root 4627 1 0 Aug25 ? 00:00:00 /usr/sbin/sshd
root 4645 1 0 Aug25 ? 00:00:00 /usr/sbin/acpid
root 4683 1 0 Aug25 ? 00:00:00 /usr/sbin/cron
ntp 4713 1 0 Aug25 ? 00:00:00 /usr/sbin/ntpd -g -c
/etc/ntp.conf -p /var/run/ntpd.pid -u ntp:ntp
daemon 4747 1 0 Aug25 ? 00:00:00 /sbin/portmap
root 4773 1 0 Aug25 ? 00:00:00 /sbin/rpc.statd
root 4796 1 0 Aug25 ttyS0 00:00:00 /sbin/getty -L 115200
ttyS0 vt100
root 4797 1 0 Aug25 ? 00:00:22 initBC
root 4908 2 0 Aug25 ? 00:00:00 [xfsalloc]
root 4909 2 0 Aug25 ? 00:00:00 [xfs_mru_cache]
root 4910 2 0 Aug25 ? 00:00:00 [xfslogd]
root 4930 2 0 Aug25 ? 00:00:00 [xfsbufd/sdc]
root 4931 2 0 Aug25 ? 00:00:00 [xfs-data/sdc]
root 4932 2 0 Aug25 ? 00:00:00 [xfs-conv/sdc]
root 4933 2 0 Aug25 ? 00:00:01 [xfsaild/sdc]
root 4935 2 0 Aug25 ? 00:00:00 [xfsbufd/sdd]
root 4936 2 0 Aug25 ? 00:00:00 [xfs-data/sdd]
root 4937 2 0 Aug25 ? 00:00:00 [xfs-conv/sdd]
root 4938 2 0 Aug25 ? 00:00:01 [xfsaild/sdd]
root 5115 1 6 Aug25 ? 00:11:28 [streamRT-sa] <defunct>
root 6582 2 0 Aug25 ? 00:00:00 [kworker/1:2]
root 6633 2 0 Aug25 ? 00:00:00 [kworker/23:1]
root 6636 2 0 Aug25 ? 00:00:00 [kworker/15:0]
root 6640 2 0 Aug25 ? 00:00:00 [kworker/29:2]
root 6647 2 0 Aug25 ? 00:00:00 [kworker/3:1]
root 6649 2 0 Aug25 ? 00:00:00 [kworker/25:2]
root 6654 2 0 Aug25 ? 00:00:00 [kworker/1:0]
root 6658 2 0 Aug25 ? 00:00:00 [kworker/19:1]
root 6659 2 0 Aug25 ? 00:00:00 [kworker/21:1]
root 6661 2 0 Aug25 ? 00:00:00 [kworker/27:2]
root 6662 2 0 Aug25 ? 00:00:00 [kworker/15:2]
root 6663 2 0 Aug25 ? 00:00:00 [kworker/5:2]
root 6666 2 0 Aug25 ? 00:00:00 [kworker/31:2]
root 6671 2 0 Aug25 ? 00:00:00 [kworker/0:4]
root 6673 2 0 Aug25 ? 00:00:00 [kworker/7:3]
root 6677 2 0 Aug25 ? 00:00:00 [kworker/0:6]
root 6678 2 0 Aug25 ? 00:00:00 [kworker/0:7]
root 6681 2 0 Aug25 ? 00:00:00 [kworker/0:10]
root 6684 2 0 Aug25 ? 00:00:00 [kworker/13:5]
root 6690 2 0 Aug25 ? 00:00:00 [kworker/9:4]
root 6695 2 0 Aug25 ? 00:00:00 [kworker/7:5]
root 6696 2 0 Aug25 ? 00:00:00 [kworker/13:7]
root 6697 2 0 Aug25 ? 00:00:00 [kworker/9:6]
root 6702 2 0 Aug25 ? 00:00:00 [kworker/0:11]
root 6705 2 0 Aug25 ? 00:00:00 [kworker/0:14]
root 6707 2 0 Aug25 ? 00:00:00 [kworker/0:16]
root 6719 2 0 Aug25 ? 00:00:00 [kworker/0:24]
root 6721 2 0 Aug25 ? 00:00:00 [kworker/0:26]
root 6723 2 0 Aug25 ? 00:00:00 [kworker/9:8]
root 6726 2 0 Aug25 ? 00:00:00 [kworker/31:5]
root 6731 2 0 Aug25 ? 00:00:00 [kworker/1:7]
root 6734 2 0 Aug25 ? 00:00:00 [kworker/11:6]
root 6736 2 0 Aug25 ? 00:00:00 [kworker/13:10]
root 6740 2 0 Aug25 ? 00:00:00 [kworker/0:30]
root 6741 2 0 Aug25 ? 00:00:00 [kworker/0:31]
root 6745 2 0 Aug25 ? 00:00:00 [kworker/9:9]
root 6748 2 0 Aug25 ? 00:00:00 [kworker/0:36]
root 6749 2 0 Aug25 ? 00:00:00 [kworker/19:3]
root 6753 2 0 Aug25 ? 00:00:00 [kworker/23:3]
root 6757 2 0 Aug25 ? 00:00:00 [kworker/13:12]
root 6758 2 0 Aug25 ? 00:00:00 [kworker/7:6]
root 6761 2 0 Aug25 ? 00:00:00 [kworker/0:41]
root 6767 2 0 Aug25 ? 00:00:00 [kworker/0:44]
root 6768 2 0 Aug25 ? 00:00:00 [kworker/5:7]
root 6769 2 0 Aug25 ? 00:00:00 [kworker/1:9]
root 6770 2 0 Aug25 ? 00:00:00 [kworker/0:45]
root 6774 2 0 Aug25 ? 00:00:00 [kworker/17:7]
root 6776 2 0 Aug25 ? 00:00:00 [kworker/3:4]
root 6779 2 0 Aug25 ? 00:00:00 [kworker/11:9]
root 6781 2 0 Aug25 ? 00:00:00 [kworker/0:50]
root 6787 2 0 Aug25 ? 00:00:00 [kworker/0:53]
root 6788 2 0 Aug25 ? 00:00:00 [kworker/25:1]
root 6793 2 0 Aug25 ? 00:00:00 [kworker/1:10]
root 6795 2 0 Aug25 ? 00:00:00 [kworker/27:4]
root 6797 2 0 Aug25 ? 00:00:00 [kworker/0:57]
root 6798 2 0 Aug25 ? 00:00:00 [kworker/0:58]
root 6806 2 0 Aug25 ? 00:00:00 [kworker/15:5]
root 6807 2 0 Aug25 ? 00:00:00 [kworker/0:62]
root 6808 2 0 Aug25 ? 00:00:00 [kworker/0:63]
root 6809 2 0 Aug25 ? 00:00:00 [kworker/17:8]
root 6814 2 0 Aug25 ? 00:00:00 [kworker/0:66]
root 6815 2 0 Aug25 ? 00:00:01 [kworker/0:67]
root 6819 2 0 Aug25 ? 00:00:00 [kworker/0:69]
root 6824 2 0 Aug25 ? 00:00:00 [kworker/25:3]
root 6895 2 0 00:20 ? 00:00:00 [kworker/u:0]
root 6900 2 0 00:26 ? 00:00:00 [kworker/u:1]
root 6904 2 0 00:57 ? 00:00:00 [flush-0:20]
root 6933 2 0 01:00 ? 00:00:00 [kworker/u:2]
root 6934 4627 0 01:02 ? 00:00:00 sshd: iris [priv]
iris 6937 6936 0 01:02 pts/1 00:00:00 -bash
root 6967 6937 0 01:02 pts/1 00:00:00 sudo su
root 6968 6967 0 01:02 pts/1 00:00:00 su
root 6969 6968 0 01:02 pts/1 00:00:00 bash
root 6976 6969 0 01:04 pts/1 00:00:00 ps -ef
# lsof|grep stream
lsof: WARNING: can't stat() xfs file system /mnt/VOL1
Output information may be incomplete.
bash 6937 iris cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
sudo 6967 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
su 6968 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
bash 6969 root cwd DIR 0,20 4096
2113547 /iris/home/adrian/streamRT (mbc:/iris)
--
Stan
Dave Chinner
2014-08-26 07:53:45 UTC
Permalink
Post by Stan Hoeppner
Had some controller issues but believe we had those ironed out before this
recent breakage. I had reformatted both 48TB LUNs on this test box with -f
and defaults, and fired up the test app again. Throughput was fantastic
with no dropped application buffers for ~30 minutes, and IO times were 1.x
ms max, suggesting all the writes were acked by controller cache. The app
Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005
I have many call traces in dmesg, most are XFS, pasted below. I found no
SCSI I/O errors in dmesg. Application is submitting writes via libaio and
using O_DIRECT, writing to preallocated files. I don't see any errors in
the storage controller log and everything seems to check out there. Using
noop elevator, mpt2sas, no multipathing. The application process is hung
in d state and kill -9 won't get rid of it. I can't unmount the hosed up
filesystem. Any ideas?
# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error
# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error
# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s
# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1
That's an inode we failed to stat() - most likely because of the IO
error. Has the filesystem shut down? The IO error should have had
some kind of output in dmesg associated with it from XFS....
Post by Stan Hoeppner
[22635.102013] INFO: task kworker/7:0:45 blocked for more than 120 seconds.
[22635.102016] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[22635.102018] kworker/7:0 D ffff8840666c0b08 0 45 2 0x00000000
[22635.102021] ffff8840666e7bd0 0000000000000046 ffff883f7c02d000 ffff8840666f5180
[22635.102026] ffff8840666e7b80 0000000000000206 00000000000122c0 00000000000122c0
[22635.102030] ffff8840666e7fd8 ffff8840666e6000 00000000000122c0 ffff8840666e6000
[22635.102041] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102044] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102047] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102051] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
[22635.102055] [<ffffffff814f6a92>] ? _raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102070] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102077] [<ffffffffa01b5ab0>] xfs_iomap_write_unwritten+0x1b3/0x32e [xfs]
[22635.102080] [<ffffffff814f6a92>] ? _raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102084] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102088] [<ffffffff810bc602>] ? mempool_free+0x73/0x78
[22635.102093] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102098] [<ffffffffa01ab45b>] xfs_end_io+0x89/0xb4 [xfs]
[22635.102102] [<ffffffff81047321>] process_one_work+0x204/0x327
[22635.102105] [<ffffffff8104757f>] worker_thread+0x13b/0x25a
[22635.102108] [<ffffffff81047444>] ? process_one_work+0x327/0x327
[22635.102111] [<ffffffff8104af6f>] kthread+0x89/0x91
[22635.102115] [<ffffffff814fdbd4>] kernel_thread_helper+0x4/0x10
[22635.102118] [<ffffffff8104aee6>] ? __init_kthread_worker+0x3c/0x3c
[22635.102120] [<ffffffff814fdbd0>] ? gs_change+0xb/0xb
That's stuck on an inode lock. That reminds me of a problem we had in
RHEL6 with highly concurrent direct IO. The rwsems were buggy, and
XFS was tripping over the bug and hanging just like this. If the
filesystem is not shutting down and causing that IO error due to a
shutdown state, then it's entirely possible we've got another rwsem
issue. Now I've just got to go find the RHEL 6 problem again. Ah:

91af708 rwsem: Test for no active locks in __rwsem_do_wake undo code

But that was fixed in 2.6.34 (rhel 6 was based on 2.6.32), so I
doubt that is your problem. However, it smells almost exactly the
same - it took about an hour of highly concurrent direct IO to SSDs
to trigger, and Lachlan ended up finding the right incantation of
lock debug code inside the xfs mrlock implementation to prove that
it was an rwsem bug and not an XFS locking issue.

Are you able to take crash dumps from this machine so you can dig
around inside the XFS inode and rwsem states when the system locks
up like this?
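(Even short of a full crash dump, "echo w > /proc/sysrq-trigger" dumps
the stacks of all uninterruptible tasks to dmesg, which might be enough
to see who is holding the rwsem - assuming sysrq is enabled on that
kernel.)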

One thing in the RH bug that was suggested as a simple test to
determine if there's a problem with the rwsems is to edit the kernel
config in arch/x86/Kconfig:

Change:

config RWSEM_GENERIC_SPINLOCK
def_bool !X86_XADD

config RWSEM_XCHGADD_ALGORITHM
def_bool X86_XADD

to:

config RWSEM_GENERIC_SPINLOCK
def_bool y

config RWSEM_XCHGADD_ALGORITHM
def_bool n

I haven't checked if this still works on a 3.4 kernel, but it will
change the rwsem implementation to the generic, spinlock based
implementation rather than the super-special, highly optimised
x86 specific implementation. If that makes the problem go away,
then we've got another rwsem bug on our hands. If it doesn't, then I
can probably get you the mrlock debug code lachlan wrote and we can
see if XFS is doing something wrong...
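(A quick way to confirm which implementation a given kernel was built
with, assuming the build ships a config file in /boot:

# grep RWSEM /boot/config-$(uname -r)

On stock x86-64 that should show CONFIG_RWSEM_XCHGADD_ALGORITHM=y;
after the change above you'd expect CONFIG_RWSEM_GENERIC_SPINLOCK=y.)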

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
Stan Hoeppner
2014-08-26 17:19:43 UTC
Permalink
Post by Dave Chinner
Post by Stan Hoeppner
Had some controller issues but believe we had those ironed out before this
recent breakage. I had reformatted both 48TB LUNs on this test box with -f
and defaults, and fired up the test app again. Throughput was fantastic
with no dropped application buffers for ~30 minutes, and IO times were 1.x
ms max, suggesting all the writes were acked by controller cache. The app
Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005
I have many call traces in dmesg, most are XFS, pasted below. I found no
SCSI I/O errors in dmesg. Application is submitting writes via libaio and
using O_DIRECT, writing to preallocated files. I don't see any errors in
the storage controller log and everything seems to check out there. Using
noop elevator, mpt2sas, no multipathing. The application process is hung
in d state and kill -9 won't get rid of it. I can't unmount the hosed up
filesystem. Any ideas?
# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error
# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error
# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s
# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1
That's an inode we failed to stat() - most likely because of the IO
error. Has the filesystem shut down? The IO error should have had
some kind of output in dmesg associated with it from XFS....
Sorry, I omitted the XFS logging preceding the first call trace:

Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.310304] SGI XFS with security attributes, large block/inode numbers, no debug enabled
Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.311101] XFS (sdc): Mounting Filesystem
Aug 25 18:02:51 Anguish-ssu-1 kernel: [ 4280.501405] XFS (sdc): Starting recovery (logdev: internal)
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766547] XFS (sdc): Failed to recover EFIs
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766549] XFS (sdc): log mount finish failed
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.471829] XFS (sdc): Mounting Filesystem
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.620805] XFS (sdc): Ending clean mount
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.613778] XFS (sdd): Mounting Filesystem
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.835412] XFS (sdd): Ending clean mount
Aug 25 23:05:39 Anguish-ssu-1 kernel: [22409.328839] XFS (sdd): xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa01cc9a6
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307128] XFS (sdd): failed to update timestamps for inode 0x2a000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307484] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307487] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307499] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307693] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.368106] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.369785] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.605835] XFS (sdd): failed to update timestamps for inode 0x2810f413c
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.606169] XFS (sdd): failed to update timestamps for inode 0x60000009f

Again, when the app locked up I was assuming we had more controller
issues, and I expected to find SCSI I/O errors preceding the XFS errors.
But there are none. The controllers check out, no errors logged on them.
Diags pass, etc. When NOT using AIO for submission, just parallel
O_DIRECT, the big rig is sustaining ~2.5GB/s write throughput across 14
arrays at the application level with ~4200 write threads submitting IOs
concurrently, with no problems, no errors. ~1/3rd of these write threads
are slow writers, each submitting a 32KB IO to a file every few seconds.
Fast and medium rate streams fill the cache, thus these slow writer streams
generate RMWs. This is why the throughput is relatively low for 168
effective spindles which are capable of streaming writes in the ~16GB/s
neighborhood. My little test rig is submitting from 602 threads in
parallel, 301 threads each to one of two filesystems each on a 12+1 RAID5
LUN. Both boxes have the same problem described here, but only when
submitting with AIO. I don't have logs for the big box, but I did see some
of the same collateral damage on it, specifically the garbage when doing an
'ls -la' on the mount points. This is obviously due to the XFS forced
shutdown.
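For reference, the submission path involved boils down to something like
the minimal sketch below - this is not our actual app, and the sizes,
alignment, and queue depth are illustrative (the path is just the one
from the error message):

/* Minimal AIO + O_DIRECT write sketch (illustrative only).
   Build: gcc -o aio-write aio-write.c -laio */
#define _GNU_SOURCE
#include <libaio.h>
#include <fcntl.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
        io_context_t ctx = 0;
        struct iocb cb, *cbs[1] = { &cb };
        struct io_event ev;
        void *buf;
        int fd;

        /* preallocated file, as in the failing workload */
        fd = open("/mnt/VOL1/sg-04/str-0015/f-0000000005",
                  O_WRONLY | O_DIRECT);
        if (fd < 0)
                return 1;

        /* O_DIRECT wants sector-aligned buffer, length and offset */
        if (posix_memalign(&buf, 4096, 32768))
                return 1;
        memset(buf, 0, 32768);

        if (io_setup(64, &ctx))                 /* allow 64 iocbs in flight */
                return 1;

        io_prep_pwrite(&cb, fd, buf, 32768, 0); /* one 32KB write at offset 0 */
        if (io_submit(ctx, 1, cbs) != 1)        /* -> sys_io_submit in the traces */
                return 1;
        io_getevents(ctx, 1, 1, &ev, NULL);     /* reap the completion */

        io_destroy(ctx);
        close(fd);
        free(buf);
        return 0;
}

The hangs show up when hundreds of contexts like this are submitting
concurrently against the same filesystems.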
Post by Dave Chinner
Post by Stan Hoeppner
[22635.102013] INFO: task kworker/7:0:45 blocked for more than 120 seconds.
[22635.102016] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[22635.102018] kworker/7:0 D ffff8840666c0b08 0 45 2 0x00000000
[22635.102021] ffff8840666e7bd0 0000000000000046 ffff883f7c02d000 ffff8840666f5180
[22635.102026] ffff8840666e7b80 0000000000000206 00000000000122c0 00000000000122c0
[22635.102030] ffff8840666e7fd8 ffff8840666e6000 00000000000122c0 ffff8840666e6000
[22635.102041] [<ffffffff814f5fd7>] schedule+0x64/0x66
[22635.102044] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[22635.102047] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[22635.102051] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
[22635.102055] [<ffffffff814f6a92>] ? _raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102058] [<ffffffff814f5458>] ? down_write+0x25/0x27
[22635.102070] [<ffffffffa01b35e4>] xfs_ilock+0x4f/0xb4 [xfs]
[22635.102077] [<ffffffffa01b5ab0>] xfs_iomap_write_unwritten+0x1b3/0x32e [xfs]
[22635.102080] [<ffffffff814f6a92>] ? _raw_spin_unlock_irqrestore+0x30/0x3d
[22635.102084] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102088] [<ffffffff810bc602>] ? mempool_free+0x73/0x78
[22635.102093] [<ffffffffa01ab3d2>] ? xfs_setfilesize+0x128/0x128 [xfs]
[22635.102098] [<ffffffffa01ab45b>] xfs_end_io+0x89/0xb4 [xfs]
[22635.102102] [<ffffffff81047321>] process_one_work+0x204/0x327
[22635.102105] [<ffffffff8104757f>] worker_thread+0x13b/0x25a
[22635.102108] [<ffffffff81047444>] ? process_one_work+0x327/0x327
[22635.102111] [<ffffffff8104af6f>] kthread+0x89/0x91
[22635.102115] [<ffffffff814fdbd4>] kernel_thread_helper+0x4/0x10
[22635.102118] [<ffffffff8104aee6>] ? __init_kthread_worker+0x3c/0x3c
[22635.102120] [<ffffffff814fdbd0>] ? gs_change+0xb/0xb
That's stuck on an inode lock. That reminds me of a problem we had in
RHEL6 with highly concurrent direct IO. The rwsems were buggy, and
XFS was tripping over the bug and hanging just like this. If the
filesystem is not shutting down and causing that IO error due to a
shutdown state, then it's entirely possible we've got another rwsem
Definitely highly concurrent direct io here. Filesystem *is* shutting
down. Again, this only happens when submitting via AIO.
Post by Dave Chinner
91af708 rwsem: Test for no active locks in __rwsem_do_wake undo code
But that was fixed in 2.6.34 (rhel 6 was based on 2.6.32), so I
doubt that is your problem. However, it smells almost exactly the
same - it took about an hour of highly concurrent direct IO to SSDs
to trigger, and Lachlan ended up finding the right incantation of
lock debug code inside the xfs mrlock implementation to prove that
it was an rwsem bug and not an XFS locking issue.
FWIW I'm seeing it after about 30 minutes on the small box. I don't know
how quickly it occurred on the big rig.
Post by Dave Chinner
Are you able to take crash dumps from this machine so you can dig
around inside the XFS inode and rwsem states when the system locks
up like this?
The kernel isn't completely crashing. I am able to CTRL+Z out of the hung
application back to a prompt, and open additional SSH shells. But when
these errors occur, something in the kernel does become goofy, as the load
average spikes to the stratosphere, even though there is no CPU burn nor IO
wait. Once the app crashes the box is 100% idle. Do keep in mind as we
troubleshoot this that the boxen are in a machine room 520 miles from my
location. So I can't push magic key sequences on the console. I don't
have KVM access, though I might be able to get it if needed; I haven't
asked.
Post by Dave Chinner
One thing in the RH bug that was suggested as a simple test to
determine if there's a problem with the rwsems is to edit the kernel
config RWSEM_GENERIC_SPINLOCK
def_bool !X86_XADD
config RWSEM_XCHGADD_ALGORITHM
def_bool X86_XADD
config RWSEM_GENERIC_SPINLOCK
def_bool y
config RWSEM_XCHGADD_ALGORITHM
def_bool n
These aren't truly lab systems in all respects. I don't have the
permission/access to change the kernel config and recompile. There is a
"process" for that and a different group owns that process. If this is
necessary to troubleshoot this then I'll see if I can push it through.
Post by Dave Chinner
I haven't checked if this still works on a 3.4 kernel, but it will
change the rwsem implementation to the generic, spinlock based
implementation rather than the super-special, highly optimised
x86 specific implementation. If that makes the problem go away,
then we've got another rwsem bug on our hands. If it doesn't, then I
can probably get you the mrlock debug code lachlan wrote and we can
see if XFS is doing something wrong...
Thanks for the detailed explanation. Given the new information I've
provided, what is my next step?

Thanks,

Stan
Dave Chinner
2014-08-28 00:32:27 UTC
Permalink
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Had some controller issues but believe we had those ironed out before this
recent breakage. I had reformatted both 48TB LUNs on this test box with -f
and defaults, and fired up the test app again. Throughput was fantastic
with no dropped application buffers for ~30 minutes, and IO times were 1.x
ms max, suggesting all the writes were acked by controller cache. The app
Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005
I have many call traces in dmesg, most are XFS, pasted below. I found no
SCSI I/O errors in dmesg. Application is submitting writes via libaio and
using O_DIRECT, writing to preallocated files. I don't see any errors in
the storage controller log and everything seems to check out there. Using
noop elevator, mpt2sas, no multipathing. The application process is hung
in d state and kill -9 won't get rid of it. I can't unmount the hosed up
filesystem. Any ideas?
# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error
# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error
# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s
# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1
That's an inode we failed to stat() - most likely because of the IO
error. Has the filesystem shut down? The IO error should have had
some kind of output in dmesg associated with it from XFS....
Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.310304] SGI XFS with security attributes, large block/inode numbers, no debug enabled
Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.311101] XFS (sdc): Mounting Filesystem
Aug 25 18:02:51 Anguish-ssu-1 kernel: [ 4280.501405] XFS (sdc): Starting recovery (logdev: internal)
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766547] XFS (sdc): Failed to recover EFIs
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766549] XFS (sdc): log mount finish failed
That's indicative of corrupted free space btrees.
Post by Stan Hoeppner
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.471829] XFS (sdc): Mounting Filesystem
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.620805] XFS (sdc): Ending clean mount
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.613778] XFS (sdd): Mounting Filesystem
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.835412] XFS (sdd): Ending clean mount
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE
because it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.

Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Post by Stan Hoeppner
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307128] XFS (sdd): failed to update timestamps for inode 0x2a000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307484] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307487] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307499] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307693] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.368106] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.369785] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.605835] XFS (sdd): failed to update timestamps for inode 0x2810f413c
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.606169] XFS (sdd): failed to update timestamps for inode 0x60000009f
And that is interesting. Makes me wonder if the inode is getting
unlocked on transaction commit failure, or whether there's some
other path in the shutdown code that is not unlocking the inode
correctly.
Post by Stan Hoeppner
Again, when the app locked up I was assuming we had more controller
issues, and I expected to find SCSI I/O errors preceding the XFS errors.
But there are none.
XFS has had some kind of internal error in extent or freespace
management. The lack of output when the error has triggered makes it
impossible to determine what might have gone wrong. Seeing what
xfs_repair -n says about the filesystem would be interesting.
Probably best to use an xfs_repair from the current 3.2.1 release,
because older 3.1.x versions (I think 3.1.10 and prior) didn't
validate the freespace btrees - they just got rebuilt.
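i.e. with the filesystem unmounted, something like:

# xfs_repair -n /dev/sdd

(-n is check-only mode; it reports what it would fix without writing
anything.)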
Post by Stan Hoeppner
These aren't truly lab systems in all respects. I don't have the
permission/access to change the kernel config and recompile. There is a
"process" for that and a different group owns that process. If this is
necessary to troubleshoot this then I'll see if I can push it through.
Good luck, it sounds like you might have a few stories for the
Daily WTF if the "processes" there are so entrenched that it's
impossible to test fixes.
Post by Stan Hoeppner
Post by Dave Chinner
I haven't cheked if this still works on a 3.4 kernel, but it will
change the rwsem implementation to the generic, spinlock based
implementation rather than the super-special, highly optimised
x86 specific implementation. If that makes the problem go away,
then we've got another rwsem bug on our hands. If it doesn't, then I
can probably get you the mrlock debug code lachlan wrote and we can
see if XFS is doing something wrong...
Thanks for the detailed explanation. Given the new information I've
provided, what is my next step?
xfs_repair output, turn /proc/sys/fs/xfs/error_level up to 11,
maybe run a test with a CONFIG_XFS_DEBUG=y kernel, or even a
current 3.15/3.16 kernel to see if the problem still exists...
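e.g.:

# echo 11 > /proc/sys/fs/xfs/error_level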

But if you can't get new kernels or xfsprogs binaries onto the
machine, then there is very little that we can do.

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
Stan Hoeppner
2014-08-28 22:31:33 UTC
Permalink
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Had some controller issues but believe we had those ironed out before this
recent breakage. I had reformatted both 48TB LUNs on this test box with -f
and defaults, and fired up the test app again. Throughput was fantastic
with no dropped application buffers for ~30 minutes, and IO times were 1.x
ms max, suggesting all the writes were acked by controller cache. The app
was running perfectly. Then it hanged and spit out an internal error
Failed to reopen file /mnt/VOL1/sg-04/str-0015/f-0000000005
I have many call traces in dmesg, most are XFS, pasted below. I found no
SCSI I/O errors in dmesg. Application is submitting writes via libaio and
using O_DIRECT, writing to preallocated files. I don't see any errors in
the storage controller log and everything seems to check out there. Using
noop elevator, mpt2sas, no multipathing. The application process is hung
in d state and kill -9 won't get rid of it. I can't unmount the hosed up
filesystem. Any ideas?
# ls -la /mnt/VOL1
ls: cannot access /mnt/VOL1: Input/output error
# dd if=/mnt/VOL1 of=/dev/null bs=1M count=16
dd: opening `/mnt/VOL1': Input/output error
# dd if=/dev/sdd of=/dev/null bs=1M count=16
16+0 records in
16+0 records out
16777216 bytes (17 MB) copied, 1.10989 s, 15.1 MB/s
# ls -la /mnt/
ls: cannot access /mnt/VOL1: Input/output error
total 8
drwxrwxrwt 6 root root 120 Aug 25 17:59 .
drwxr-xr-x 21 root root 4096 Jul 15 09:39 ..
drwxrwxrwt 3 root root 80 Aug 25 16:52 initramfs
drwxr-xr-x 3 root root 4096 Apr 24 04:57 scratch
drwxrwxrwx 5 root root 58 Aug 25 22:06 VOL0
d????????? ? ? ? ? ? VOL1
That's an inode we failed to stat() - most likely because of the IO
error. Has the filesystem shut down? The IO error should have had
some kind of output in dmesg associated with it from XFS....
Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.310304] SGI XFS with security attributes, large block/inode numbers, no debug enabled
Aug 25 18:02:50 Anguish-ssu-1 kernel: [ 4280.311101] XFS (sdc): Mounting Filesystem
Aug 25 18:02:51 Anguish-ssu-1 kernel: [ 4280.501405] XFS (sdc): Starting recovery (logdev: internal)
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766547] XFS (sdc): Failed to recover EFIs
Aug 25 18:02:53 Anguish-ssu-1 kernel: [ 4282.766549] XFS (sdc): log mount finish failed
That's indicative of corrupted free space btrees.
I did mkfs on both LUNs after this mount failure.
Post by Dave Chinner
Post by Stan Hoeppner
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.471829] XFS (sdc): Mounting Filesystem
Aug 25 18:04:50 Anguish-ssu-1 kernel: [ 4399.620805] XFS (sdc): Ending clean mount
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.613778] XFS (sdd): Mounting Filesystem
Aug 25 18:04:56 Anguish-ssu-1 kernel: [ 4405.835412] XFS (sdd): Ending clean mount
The back-to-back mounts here were me testing my storage prep automation
script - creates the mount points, sets permission, mounts the filesystems,
sets the elevator to noop, etc. (their kernel defaults to CFQ...)
Post by Dave Chinner
Post by Stan Hoeppner
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE
which it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.
Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Newly made after every error of any kind, whether app, XFS shutdown, call
trace, etc. I've not attempted xfs_repair. Part of the problem is the
storage hardware is a moving target. They're swapping modules and
upgrading firmware every few days. And I don't have a view into that. So
it's difficult to know when IO problems are due to hardware or buggy code.
However, I can state with certainty that we only run into the XFS problems
when using AIO. And it has occurred on both test rigs, each of which have
their own RAID controllers and disks.
Post by Dave Chinner
Post by Stan Hoeppner
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307128] XFS (sdd): failed to update timestamps for inode 0x2a000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307484] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307487] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307499] XFS (sdd): failed to update timestamps for inode 0x29000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.307693] XFS (sdd): failed to update timestamps for inode 0x20000009f
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.368106] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.369785] XFS (sdd): failed to update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.605835] XFS (sdd): failed to update timestamps for inode 0x2810f413c
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.606169] XFS (sdd): failed to update timestamps for inode 0x60000009f
And that is interesting. Makes me wonder if the inode is getting
unlocked on transaction commit failure, or whether there's some
other path in the shutdown code that is not unlocking the inode
correctly.
Is this a separate time stamp from that which noatime disables? We're
mounting with noatime, nodiratime.
Post by Dave Chinner
Post by Stan Hoeppner
Again, when the app locked up I was assuming we had more controller
issues, and I expected to find SCSI I/O errors preceding the XFS errors.
But there are none.
XFS has had some kind of internal error in extent or freespace
management. The lack of output when the error has triggered makes it
impossible to determine what might have gone wrong. Seeing what
xfs_repair -n says about the filesystem would be interesting.
Probably best to use an xfs_repair from the current 3.2.1 release,
because older 3.1.x versions (I think 3.1.10 and prior) didn't
validate the freespace btrees - they just got rebuilt.
I'm hoping to be able to get kernel 3.12.26 onto the smaller rig soon so I
can test bcache. I'll enable xfs_debug on it. WRT newer user space tools,
that's like pulling teeth. It's all like pulling teeth actually, WRT
the platform software. WRT the app we can do whatever we want.
Post by Dave Chinner
Post by Stan Hoeppner
These aren't truly lab systems in all respects. I don't have the
permission/access to change the kernel config and recompile. There is a
"process" for that and a different group owns that process. If this is
necessary to troubleshoot this then I'll see if I can push it through.
Good luck, it sounds like you might have a few stories for the
Daily WTF if the "processes" there are so entrenched that it's
impossible to test fixes.
Yes, it's difficult. I won't be able to test patches as things stand.
That will require much time, many meetings, my project manager throwing
chairs, that kinda thing. And he's very mild mannered.
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
I haven't checked if this still works on a 3.4 kernel, but it will
change the rwsem implementation to the generic, spinlock based
implementation rather than the super-special, highly optimised
x86 specific implementation. If that makes the problem go away,
then we've got another rwsem bug on our hands. If it doesn't, then I
can probably get you the mrlock debug code lachlan wrote and we can
see if XFS is doing something wrong...
Thanks for the detailed explanation. Given the new information I've
provided, what is my next step?
xfs_repair output, turn /proc/sys/fs/xfs/error_level up to 11,
maybe run a test with a CONFIG_XFS_DEBUG=y kernel, or even a
current 3.15/3.16 kernel to see if the problem still exists...
But if you can't get new kernels or xfsprogs binaries onto the
machine, then there is very little that we can do.
One would think such a thing would be easy, a no brainer. But this is a
large corporation after all, so I shouldn't be surprised by red tape. The
fact it took them over 3 months from contact to getting me in the door was
a bit of a clue up front. I think the way this is seen is that AIO would
be nice to have, but it's not a necessity. Not with the red tape to cut
through to get newer system software...
Post by Dave Chinner
Cheers,
Dave.
--
Stan
Dave Chinner
2014-08-28 23:08:17 UTC
Permalink
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE
which it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.
Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Newly made after every error of any kind, whether app, XFS shutdown, call
trace, etc. I've not attempted xfs_repair.
Please do.
Post by Stan Hoeppner
Part of the problem is the storage hardware is a moving target.
They're swapping modules and upgrading firmware every few days.
And I don't have a view into that. So it's difficult to know when
IO problems are due to hardware or buggy code. However, I can
state with certainty that we only run into the XFS problems when
using AIO. And it has occurred on both test rigs, each of which
have their own RAID controllers and disks.
Which, in and of itself doesn't point at AIO or XFS being the
problem. What it says is that something goes wrong under the
extremely high IO load that can be generated with AIO+DIO. That
"something" might be a race in XFS, a bug in AIO, or could be a
load related storage problem.

For example, I've been on the wrong end of hard to track down
problems on beta/early access storage before. There was
an incident years ago that took more than 3 months to isolate a
filesystem corruption that occurred under high load. It took that
long to isolate a test case, reproduce it in house on identical
hardware, firmware, software, etc and then *capture it with a FC
analyser*.

The bug? The bleeding edge storage arrays being used had a
firmware bug in them. When the number of outstanding IOs hit the
*array controller* command tag queue depth limit (some several
thousand simultaneous IOs in flight) it would occasionally misdirect
a single write IO to the *wrong lun*. i.e. it would misdirect a
write.

It was only under *extreme* loads that this would happen, and it's
this sort of load that AIO+DIO can easily generate - you can have
several thousand IOs in flight without too much hassle, and that
will hit limits in the storage arrays that aren't often hit. Array
controller CTQ depth limits are a good example of a limit that
normal IO won't go near to stressing.
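(If you want to see how close the test gets, the per-LUN depth the
kernel will queue is visible in sysfs - assuming the usual sd sysfs
layout, e.g. "cat /sys/block/sdd/device/queue_depth" - though the array
controller's internal CTQ limit is a separate, firmware-side number.)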
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.605835] XFS (sdd): failed to update timestamps for inode 0x2810f413c
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.606169] XFS (sdd): failed to update timestamps for inode 0x60000009f
And that is interesting. Makes me wonder if the inode is getting
unlocked on transaction commit failure, or whether there's some
other path in the shutdown code that is not unlocking the inode
correctly.
Is this a separate time stamp from that which noatime disables? We're
mounting with noatime, nodiratime.
Yes. mtime/ctime updates go through this on the write path.

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
Stan Hoeppner
2014-08-29 16:38:16 UTC
Permalink
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE
which it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.
Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Newly made after every error of any kind, whether app, XFS shutdown, call
trace, etc. I've not attempted xfs_repair.
Please do.
Another storage crash yesterday. xfs_repair output inline below for the 7
filesystems. I'm also pasting the dmesg output. This time there is no
oops, no call traces. The filesystems mounted fine after replaying the
log and running xfs_repair.
Post by Dave Chinner
Post by Stan Hoeppner
Part of the problem is the storage hardware is a moving target.
They're swapping modules and upgrading firmware every few days.
And I don't have a view into that. So it's difficult to know when
IO problems are due to hardware or buggy code. However, I can
state with certainty that we only run into the XFS problems when
using AIO. And it has occurred on both test rigs, each of which
have their own RAID controllers and disks.
Which, in and of itself doesn't point at AIO or XFS being the
problem. What it says is that something goes wrong under the
extremely high IO load that can be generated with AIO+DIO. That
"something" might be a race in XFS, a bug in AIO, or could be a
load related storage problem.
For example, I've been on the wrong end of hard to track down
problems on beta/early access storage before. There was
an incident years ago that took more than 3 months to isolate a
filesystem corruption that occurred under high load. It took that
long to isolate a test case, reproduce it in house on identical
hardware, firmware, software, etc and then *capture it with a FC
analyser*.
The bug? The bleeding edge storage arrays being used had a
firmware bug in them. When the number of outstanding IOs hit the
*array controller* command tag queue depth limit (some several
thousand simultaneous IOs in flight) it would occasionally misdirect
a single write IO to the *wrong lun*. i.e. it would misdirect a
write.
It was only under *extreme* loads that this would happen, and it's
this sort of load that AIO+DIO can easily generate - you can have
several thousand IOs in flight without too much hassle, and that
will hit limits in the storage arrays that aren't often hit. Array
controller CTQ depth limits are a good example of a limit that
normal IO won't go near to stressing.
I hadn't considered that up to this point. That is *very* insightful, and
applicable, since we are dealing with a beta storage array and firmware.
Worth mentioning is that the storage vendor has added a custom routine
which expends Herculean effort to identify full stripes before writeback.
This is because some of our writes for a given low rate stream are as low as
32KB and may be 2-3 seconds apart. With a 64-128KB chunk, 768 to 1536KB
stripe width, we'd get massive RMW without this feature. Testing thus far
shows it is fairly effective, though we still get pretty serious RMW due to
the fact we're writing 350 of these small streams per array at ~72 KB/s
max, along with 2 streams at ~48 MB/s, and 50 streams at ~1.2 MB/s.
Multiply this by 7 LUNs per controller and it becomes clear we're putting a
pretty serious load on the firmware and cache.
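(For reference, the stripe math: a 12+1 RAID5 with a 64KB chunk means a
full stripe is 12 x 64KB = 768KB, and a 128KB chunk means 12 x 128KB =
1536KB - so any writeback that can't assemble a full 768-1536KB stripe
pays the RMW penalty.)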
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
update timestamps for inode 0xf000000a4
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.605835] XFS (sdd): failed
to
Post by Dave Chinner
Post by Stan Hoeppner
update timestamps for inode 0x2810f413c
Aug 25 23:05:49 Anguish-ssu-1 kernel: [22419.606169] XFS (sdd): failed
to
Post by Dave Chinner
Post by Stan Hoeppner
update timestamps for inode 0x60000009f
And that is interesting. Makes me wonder if the inode is getting
unlocked on transaction commit failure, or whether there's some
other path in the shutdown code that is not unlocking the inode
correctly.
Is this a separate time stamp from that which noatime disables? We're
mounting with noatime, nodiratime.
Yes. mtime/ctime updates go through this on the write path.
Cheers,
Dave.
dmesg and xfs_repair

[ 9707.147739] end_request: I/O error, dev dm-8, sector 769
[ 9707.147743] end_request: I/O error, dev dm-5, sector 1
[ 9707.147746] end_request: I/O error, dev dm-17, sector 1
[ 9707.147752] end_request: I/O error, dev dm-8, sector 770
[ 9707.147755] end_request: I/O error, dev dm-5, sector 2
[ 9707.147757] end_request: I/O error, dev dm-17, sector 2
[ 9707.147762] end_request: I/O error, dev dm-5, sector 3
[ 9707.147765] end_request: I/O error, dev dm-17, sector 3
[ 9707.147767] end_request: I/O error, dev dm-8, sector 771
[ 9707.147771] end_request: I/O error, dev dm-5, sector 4
[ 9707.147776] end_request: I/O error, dev dm-17, sector 4
[ 9707.147778] end_request: I/O error, dev dm-8, sector 772
[ 9707.147786] end_request: I/O error, dev dm-17, sector 5
[ 9707.147787] end_request: I/O error, dev dm-8, sector 773
[ 9707.147789] end_request: I/O error, dev dm-5, sector 5
[ 9707.147797] end_request: I/O error, dev dm-8, sector 774
[ 9707.147799] end_request: I/O error, dev dm-17, sector 6
[ 9707.147801] end_request: I/O error, dev dm-5, sector 6
[ 9707.147806] end_request: I/O error, dev dm-17, sector 7
[ 9707.147808] end_request: I/O error, dev dm-5, sector 7
[ 9707.147816] end_request: I/O error, dev dm-17, sector 0
[ 9707.147820] end_request: I/O error, dev dm-5, sector 0
[ 9707.147823] end_request: I/O error, dev dm-17, sector 1
[ 9707.147829] end_request: I/O error, dev dm-5, sector 1
[ 9707.147831] end_request: I/O error, dev dm-17, sector 2
[ 9707.147836] end_request: I/O error, dev dm-5, sector 2
[ 9707.147838] end_request: I/O error, dev dm-17, sector 3
[ 9707.147844] end_request: I/O error, dev dm-17, sector 4
[ 9707.147848] end_request: I/O error, dev dm-5, sector 3
[ 9707.147850] end_request: I/O error, dev dm-8, sector 775
[ 9707.147855] end_request: I/O error, dev dm-17, sector 5
[ 9707.147858] end_request: I/O error, dev dm-5, sector 4
[ 9707.147866] end_request: I/O error, dev dm-5, sector 5
[ 9707.147868] end_request: I/O error, dev dm-17, sector 6
[ 9707.147872] end_request: I/O error, dev dm-8, sector 768
[ 9707.147875] end_request: I/O error, dev dm-17, sector 7
[ 9707.147879] end_request: I/O error, dev dm-8, sector 769
[ 9707.147883] end_request: I/O error, dev dm-17, sector 0
[ 9707.147885] end_request: I/O error, dev dm-5, sector 6
[ 9707.147891] end_request: I/O error, dev dm-17, sector 1
[ 9707.147892] end_request: I/O error, dev dm-5, sector 7
[ 9707.147898] end_request: I/O error, dev dm-17, sector 2
[ 9707.147902] end_request: I/O error, dev dm-5, sector 0
[ 9707.147904] end_request: I/O error, dev dm-17, sector 3
[ 9707.147910] end_request: I/O error, dev dm-5, sector 1
[ 9707.147912] end_request: I/O error, dev dm-17, sector 4
[ 9707.147917] end_request: I/O error, dev dm-5, sector 2
[ 9707.147919] end_request: I/O error, dev dm-17, sector 5
[ 9707.147925] end_request: I/O error, dev dm-5, sector 3
[ 9707.147927] end_request: I/O error, dev dm-17, sector 6
[ 9707.147931] end_request: I/O error, dev dm-5, sector 4
[ 9707.147933] end_request: I/O error, dev dm-17, sector 7
[ 9707.147935] end_request: I/O error, dev dm-8, sector 770
[ 9707.147938] end_request: I/O error, dev dm-5, sector 5
[ 9707.147941] end_request: I/O error, dev dm-8, sector 771
[ 9707.147944] end_request: I/O error, dev dm-17, sector 0
[ 9707.147946] end_request: I/O error, dev dm-5, sector 6
[ 9707.147948] end_request: I/O error, dev dm-8, sector 772
[ 9707.147952] end_request: I/O error, dev dm-5, sector 7
[ 9707.147954] end_request: I/O error, dev dm-8, sector 773
[ 9707.147956] end_request: I/O error, dev dm-17, sector 1
[ 9707.147961] end_request: I/O error, dev dm-5, sector 0
[ 9707.147963] end_request: I/O error, dev dm-17, sector 2
[ 9707.147965] end_request: I/O error, dev dm-8, sector 774
[ 9707.147968] end_request: I/O error, dev dm-5, sector 1
[ 9707.147970] end_request: I/O error, dev dm-8, sector 775
[ 9707.147972] end_request: I/O error, dev dm-17, sector 3
[ 9707.147975] end_request: I/O error, dev dm-5, sector 2
[ 9707.147979] end_request: I/O error, dev dm-17, sector 4
[ 9707.147981] end_request: I/O error, dev dm-5, sector 3
[ 9707.147985] end_request: I/O error, dev dm-8, sector 776
[ 9707.147987] end_request: I/O error, dev dm-17, sector 5
[ 9707.147988] end_request: I/O error, dev dm-5, sector 4
[ 9707.147993] end_request: I/O error, dev dm-17, sector 6
[ 9707.147995] end_request: I/O error, dev dm-5, sector 5
[ 9707.147997] end_request: I/O error, dev dm-8, sector 777
[ 9707.148002] end_request: I/O error, dev dm-5, sector 6
[ 9707.148004] end_request: I/O error, dev dm-8, sector 778
[ 9707.148006] end_request: I/O error, dev dm-17, sector 7
[ 9707.148008] end_request: I/O error, dev dm-5, sector 7
[ 9707.148012] end_request: I/O error, dev dm-8, sector 779
[ 9707.148014] end_request: I/O error, dev dm-17, sector 0
[ 9707.148017] end_request: I/O error, dev dm-5, sector 0
[ 9707.148019] end_request: I/O error, dev dm-8, sector 780
[ 9707.148021] end_request: I/O error, dev dm-17, sector 1
[ 9707.148023] end_request: I/O error, dev dm-5, sector 1
[ 9707.148025] end_request: I/O error, dev dm-8, sector 781
[ 9707.148028] end_request: I/O error, dev dm-17, sector 2
[ 9707.148031] end_request: I/O error, dev dm-5, sector 2
[ 9707.148033] end_request: I/O error, dev dm-8, sector 782
[ 9707.148036] end_request: I/O error, dev dm-17, sector 3
[ 9707.148042] end_request: I/O error, dev dm-5, sector 3
[ 9707.148044] end_request: I/O error, dev dm-17, sector 4
[ 9707.148052] end_request: I/O error, dev dm-17, sector 5
[ 9707.148053] end_request: I/O error, dev dm-5, sector 4
[ 9707.148058] end_request: I/O error, dev dm-17, sector 6
[ 9707.148066] end_request: I/O error, dev dm-5, sector 5
[ 9707.148069] end_request: I/O error, dev dm-17, sector 7
[ 9707.148070] end_request: I/O error, dev dm-8, sector 783
[ 9707.148074] end_request: I/O error, dev dm-5, sector 6
[ 9707.148080] end_request: I/O error, dev dm-8, sector 776
[ 9707.148081] end_request: I/O error, dev dm-17, sector 128
[ 9707.148085] end_request: I/O error, dev dm-5, sector 7
[ 9707.148088] end_request: I/O error, dev dm-17, sector 129
[ 9707.148093] end_request: I/O error, dev dm-8, sector 777
[ 9707.148094] end_request: I/O error, dev dm-5, sector 16
[ 9707.148098] end_request: I/O error, dev dm-17, sector 130
[ 9707.148102] end_request: I/O error, dev dm-5, sector 17
[ 9707.148105] end_request: I/O error, dev dm-17, sector 131
[ 9707.148109] end_request: I/O error, dev dm-5, sector 18
[ 9707.148113] end_request: I/O error, dev dm-17, sector 132
[ 9707.148116] end_request: I/O error, dev dm-5, sector 19
[ 9707.148121] end_request: I/O error, dev dm-17, sector 133
[ 9707.148123] end_request: I/O error, dev dm-5, sector 20
[ 9707.148131] end_request: I/O error, dev dm-5, sector 21
[ 9707.148133] end_request: I/O error, dev dm-17, sector 134
[ 9707.148135] end_request: I/O error, dev dm-8, sector 778
[ 9707.148140] end_request: I/O error, dev dm-17, sector 135
[ 9707.148142] end_request: I/O error, dev dm-5, sector 22
[ 9707.148144] end_request: I/O error, dev dm-8, sector 779
[ 9707.148149] end_request: I/O error, dev dm-5, sector 23
[ 9707.148155] end_request: I/O error, dev dm-17, sector 128
[ 9707.148157] end_request: I/O error, dev dm-8, sector 780
[ 9707.148161] end_request: I/O error, dev dm-5, sector 0
[ 9707.148166] end_request: I/O error, dev dm-8, sector 781
[ 9707.148168] end_request: I/O error, dev dm-17, sector 129
[ 9707.148174] end_request: I/O error, dev dm-5, sector 1
[ 9707.148175] end_request: I/O error, dev dm-17, sector 130
[ 9707.148180] end_request: I/O error, dev dm-5, sector 2
[ 9707.148182] end_request: I/O error, dev dm-17, sector 131
[ 9707.148187] end_request: I/O error, dev dm-5, sector 3
[ 9707.148190] end_request: I/O error, dev dm-17, sector 132
[ 9707.148197] end_request: I/O error, dev dm-5, sector 4
[ 9707.148202] end_request: I/O error, dev dm-17, sector 133
[ 9707.148205] end_request: I/O error, dev dm-8, sector 782
[ 9707.148211] end_request: I/O error, dev dm-8, sector 783
[ 9707.148214] end_request: I/O error, dev dm-17, sector 134
[ 9707.148218] end_request: I/O error, dev dm-5, sector 5
[ 9707.148221] end_request: I/O error, dev dm-17, sector 135
[ 9707.148224] end_request: I/O error, dev dm-5, sector 6
[ 9707.148230] end_request: I/O error, dev dm-17, sector 16
[ 9707.148232] end_request: I/O error, dev dm-5, sector 7
[ 9707.148237] end_request: I/O error, dev dm-17, sector 17
[ 9707.148243] end_request: I/O error, dev dm-5, sector 0
[ 9707.148247] end_request: I/O error, dev dm-17, sector 18
[ 9707.148249] end_request: I/O error, dev dm-8, sector 784
[ 9707.148254] end_request: I/O error, dev dm-17, sector 19
[ 9707.148256] end_request: I/O error, dev dm-8, sector 785
[ 9707.148261] end_request: I/O error, dev dm-17, sector 20
[ 9707.148263] end_request: I/O error, dev dm-8, sector 786
[ 9707.148268] end_request: I/O error, dev dm-5, sector 1
[ 9707.148271] end_request: I/O error, dev dm-8, sector 787
[ 9707.148273] end_request: I/O error, dev dm-17, sector 21
[ 9707.148277] end_request: I/O error, dev dm-8, sector 788
[ 9707.148280] end_request: I/O error, dev dm-17, sector 22
[ 9707.148283] end_request: I/O error, dev dm-8, sector 789
[ 9707.148287] end_request: I/O error, dev dm-17, sector 23
[ 9707.148291] end_request: I/O error, dev dm-8, sector 790
[ 9707.148296] end_request: I/O error, dev dm-17, sector 0
[ 9707.148302] end_request: I/O error, dev dm-8, sector 791
[ 9707.148303] end_request: I/O error, dev dm-17, sector 1
[ 9707.148309] end_request: I/O error, dev dm-5, sector 2
[ 9707.148311] end_request: I/O error, dev dm-17, sector 2
[ 9707.148318] end_request: I/O error, dev dm-17, sector 3
[ 9707.148320] end_request: I/O error, dev dm-8, sector 784
[ 9707.148325] end_request: I/O error, dev dm-17, sector 4
[ 9707.148331] end_request: I/O error, dev dm-8, sector 785
[ 9707.148333] end_request: I/O error, dev dm-17, sector 5
[ 9707.148337] end_request: I/O error, dev dm-5, sector 3
[ 9707.148340] end_request: I/O error, dev dm-17, sector 6
[ 9707.148343] end_request: I/O error, dev dm-5, sector 4
[ 9707.148345] end_request: I/O error, dev dm-8, sector 786
[ 9707.148349] end_request: I/O error, dev dm-17, sector 7
[ 9707.148352] end_request: I/O error, dev dm-8, sector 787
[ 9707.148357] end_request: I/O error, dev dm-17, sector 0
[ 9707.148359] end_request: I/O error, dev dm-5, sector 5
[ 9707.148369] end_request: I/O error, dev dm-8, sector 788
[ 9707.148370] end_request: I/O error, dev dm-5, sector 6
[ 9707.148372] end_request: I/O error, dev dm-17, sector 1
[ 9707.148377] end_request: I/O error, dev dm-5, sector 7
[ 9707.148379] end_request: I/O error, dev dm-17, sector 2
[ 9707.148386] end_request: I/O error, dev dm-5, sector 0
[ 9707.148388] end_request: I/O error, dev dm-17, sector 3
[ 9707.148394] end_request: I/O error, dev dm-5, sector 1
[ 9707.148396] end_request: I/O error, dev dm-17, sector 4
[ 9707.148401] end_request: I/O error, dev dm-5, sector 2
[ 9707.148403] end_request: I/O error, dev dm-17, sector 5
[ 9707.148409] end_request: I/O error, dev dm-5, sector 3
[ 9707.148411] end_request: I/O error, dev dm-17, sector 6
[ 9707.148418] end_request: I/O error, dev dm-17, sector 7
[ 9707.148419] end_request: I/O error, dev dm-5, sector 4
[ 9707.148423] end_request: I/O error, dev dm-8, sector 789
[ 9707.148426] end_request: I/O error, dev dm-5, sector 5
[ 9707.148432] end_request: I/O error, dev dm-5, sector 6
[ 9707.148435] end_request: I/O error, dev dm-17, sector 8
[ 9707.148439] end_request: I/O error, dev dm-5, sector 7
[ 9707.148442] end_request: I/O error, dev dm-17, sector 9
[ 9707.148448] end_request: I/O error, dev dm-5, sector 0
[ 9707.148450] end_request: I/O error, dev dm-17, sector 10
[ 9707.148456] end_request: I/O error, dev dm-5, sector 1
[ 9707.148457] end_request: I/O error, dev dm-17, sector 11
[ 9707.148462] end_request: I/O error, dev dm-5, sector 2
[ 9707.148464] end_request: I/O error, dev dm-17, sector 12
[ 9707.148469] end_request: I/O error, dev dm-5, sector 3
[ 9707.148471] end_request: I/O error, dev dm-17, sector 13
[ 9707.148475] end_request: I/O error, dev dm-5, sector 4
[ 9707.148478] end_request: I/O error, dev dm-17, sector 14
[ 9707.148483] end_request: I/O error, dev dm-5, sector 5
[ 9707.148485] end_request: I/O error, dev dm-17, sector 15
[ 9707.148494] end_request: I/O error, dev dm-17, sector 16
[ 9707.148498] end_request: I/O error, dev dm-5, sector 6
[ 9707.148499] end_request: I/O error, dev dm-8, sector 790
[ 9707.148503] end_request: I/O error, dev dm-17, sector 17
[ 9707.148507] end_request: I/O error, dev dm-5, sector 7
[ 9707.148509] end_request: I/O error, dev dm-8, sector 791
[ 9707.148512] end_request: I/O error, dev dm-17, sector 18
[ 9707.148518] end_request: I/O error, dev dm-8, sector 0
[ 9707.148520] end_request: I/O error, dev dm-17, sector 19
[ 9707.148527] end_request: I/O error, dev dm-17, sector 20
[ 9707.148533] end_request: I/O error, dev dm-5, sector 0
[ 9707.148535] end_request: I/O error, dev dm-17, sector 21
[ 9707.148538] end_request: I/O error, dev dm-8, sector 1
[ 9707.148543] end_request: I/O error, dev dm-17, sector 22
[ 9707.148546] end_request: I/O error, dev dm-8, sector 2
[ 9707.148549] end_request: I/O error, dev dm-5, sector 1
[ 9707.148554] end_request: I/O error, dev dm-17, sector 23
[ 9707.148557] end_request: I/O error, dev dm-8, sector 3
[ 9707.148560] end_request: I/O error, dev dm-5, sector 2
[ 9707.148563] end_request: I/O error, dev dm-8, sector 4
[ 9707.148570] end_request: I/O error, dev dm-17, sector 0
[ 9707.148571] end_request: I/O error, dev dm-5, sector 3
[ 9707.148578] end_request: I/O error, dev dm-5, sector 4
[ 9707.148580] end_request: I/O error, dev dm-8, sector 5
[ 9707.148582] end_request: I/O error, dev dm-17, sector 1
[ 9707.148589] end_request: I/O error, dev dm-17, sector 2
[ 9707.148591] end_request: I/O error, dev dm-5, sector 5
[ 9707.148596] end_request: I/O error, dev dm-17, sector 3
[ 9707.148599] end_request: I/O error, dev dm-5, sector 6
[ 9707.148601] end_request: I/O error, dev dm-8, sector 6
[ 9707.148607] end_request: I/O error, dev dm-5, sector 7
[ 9707.148610] end_request: I/O error, dev dm-17, sector 4
[ 9707.148611] end_request: I/O error, dev dm-8, sector 7
[ 9707.148617] end_request: I/O error, dev dm-17, sector 5
[ 9707.148620] end_request: I/O error, dev dm-8, sector 0
[ 9707.148627] end_request: I/O error, dev dm-5, sector 0
[ 9707.148630] end_request: I/O error, dev dm-17, sector 6
[ 9707.148636] end_request: I/O error, dev dm-17, sector 7
[ 9707.148642] end_request: I/O error, dev dm-5, sector 1
[ 9707.148644] end_request: I/O error, dev dm-8, sector 1
[ 9707.148650] end_request: I/O error, dev dm-5, sector 2
[ 9707.148656] end_request: I/O error, dev dm-8, sector 2
[ 9707.148661] end_request: I/O error, dev dm-5, sector 3
[ 9707.148663] end_request: I/O error, dev dm-17, sector 0
[ 9707.148671] end_request: I/O error, dev dm-17, sector 1
[ 9707.148673] end_request: I/O error, dev dm-8, sector 3
[ 9707.148678] end_request: I/O error, dev dm-17, sector 2
[ 9707.148686] end_request: I/O error, dev dm-17, sector 3
[ 9707.148687] end_request: I/O error, dev dm-8, sector 4
[ 9707.148695] end_request: I/O error, dev dm-5, sector 4
[ 9707.148696] end_request: I/O error, dev dm-8, sector 5
[ 9707.148698] end_request: I/O error, dev dm-17, sector 4
[ 9707.148705] end_request: I/O error, dev dm-17, sector 5
[ 9707.148708] end_request: I/O error, dev dm-8, sector 6
[ 9707.148720] end_request: I/O error, dev dm-17, sector 6
[ 9707.148722] end_request: I/O error, dev dm-8, sector 7
[ 9707.148724] end_request: I/O error, dev dm-5, sector 5
[ 9707.148733] end_request: I/O error, dev dm-5, sector 6
[ 9707.148735] end_request: I/O error, dev dm-17, sector 7
[ 9707.148738] end_request: I/O error, dev dm-8, sector 0
[ 9707.148744] end_request: I/O error, dev dm-17, sector 0
[ 9707.148747] end_request: I/O error, dev dm-5, sector 7
[ 9707.148749] end_request: I/O error, dev dm-8, sector 1
[ 9707.148753] end_request: I/O error, dev dm-17, sector 1
[ 9707.148756] end_request: I/O error, dev dm-8, sector 2
[ 9707.148761] end_request: I/O error, dev dm-5, sector 0
[ 9707.148763] end_request: I/O error, dev dm-17, sector 2
[ 9707.148768] end_request: I/O error, dev dm-5, sector 1
[ 9707.148770] end_request: I/O error, dev dm-17, sector 3
[ 9707.148775] end_request: I/O error, dev dm-5, sector 2
[ 9707.148777] end_request: I/O error, dev dm-17, sector 4
[ 9707.148781] end_request: I/O error, dev dm-8, sector 3
[ 9707.148784] end_request: I/O error, dev dm-17, sector 5
[ 9707.148788] end_request: I/O error, dev dm-8, sector 4
[ 9707.148793] end_request: I/O error, dev dm-5, sector 3
[ 9707.148795] end_request: I/O error, dev dm-17, sector 6
[ 9707.148801] end_request: I/O error, dev dm-17, sector 7
[ 9707.148803] end_request: I/O error, dev dm-5, sector 4
[ 9707.148806] end_request: I/O error, dev dm-8, sector 5
[ 9707.148810] end_request: I/O error, dev dm-5, sector 5
[ 9707.148814] end_request: I/O error, dev dm-8, sector 6
[ 9707.148816] end_request: I/O error, dev dm-17, sector 0
[ 9707.148819] end_request: I/O error, dev dm-5, sector 6
[ 9707.148822] end_request: I/O error, dev dm-17, sector 1
[ 9707.148825] end_request: I/O error, dev dm-5, sector 7
[ 9707.148827] end_request: I/O error, dev dm-8, sector 7
[ 9707.148829] end_request: I/O error, dev dm-17, sector 2
[ 9707.148834] end_request: I/O error, dev dm-5, sector 0
[ 9707.148836] end_request: I/O error, dev dm-8, sector 0
[ 9707.148838] end_request: I/O error, dev dm-17, sector 3
[ 9707.148843] end_request: I/O error, dev dm-8, sector 1
[ 9707.148844] end_request: I/O error, dev dm-17, sector 4
[ 9707.148846] end_request: I/O error, dev dm-5, sector 1
[ 9707.148849] end_request: I/O error, dev dm-8, sector 2
[ 9707.148853] end_request: I/O error, dev dm-5, sector 2
[ 9707.148854] end_request: I/O error, dev dm-17, sector 5
[ 9707.148856] end_request: I/O error, dev dm-8, sector 3
[ 9707.148861] end_request: I/O error, dev dm-17, sector 6
[ 9707.148863] end_request: I/O error, dev dm-8, sector 4
[ 9707.148865] end_request: I/O error, dev dm-5, sector 3
[ 9707.148869] end_request: I/O error, dev dm-8, sector 5
[ 9707.148871] end_request: I/O error, dev dm-5, sector 4
[ 9707.148873] end_request: I/O error, dev dm-17, sector 7
[ 9707.148875] end_request: I/O error, dev dm-8, sector 6
[ 9707.148880] end_request: I/O error, dev dm-5, sector 5
[ 9707.148881] end_request: I/O error, dev dm-17, sector 0
[ 9707.148883] end_request: I/O error, dev dm-8, sector 7
[ 9707.148888] end_request: I/O error, dev dm-17, sector 1
[ 9707.148890] end_request: I/O error, dev dm-5, sector 6
[ 9707.148892] end_request: I/O error, dev dm-8, sector 0
[ 9707.148896] end_request: I/O error, dev dm-5, sector 7
[ 9707.148898] end_request: I/O error, dev dm-8, sector 1
[ 9707.148900] end_request: I/O error, dev dm-17, sector 2
[ 9707.148905] end_request: I/O error, dev dm-5, sector 0
[ 9707.148908] end_request: I/O error, dev dm-17, sector 3
[ 9707.148913] end_request: I/O error, dev dm-8, sector 2
[ 9707.148916] end_request: I/O error, dev dm-17, sector 4
[ 9707.148918] end_request: I/O error, dev dm-5, sector 1
[ 9707.148923] end_request: I/O error, dev dm-17, sector 5
[ 9707.148925] end_request: I/O error, dev dm-5, sector 2
[ 9707.148931] end_request: I/O error, dev dm-17, sector 6
[ 9707.148939] end_request: I/O error, dev dm-17, sector 7
[ 9707.148941] end_request: I/O error, dev dm-5, sector 3
[ 9707.148943] end_request: I/O error, dev dm-8, sector 3
[ 9707.148949] end_request: I/O error, dev dm-5, sector 4
[ 9707.148951] end_request: I/O error, dev dm-8, sector 4
[ 9707.148953] end_request: I/O error, dev dm-17, sector 0
[ 9707.148958] end_request: I/O error, dev dm-8, sector 5
[ 9707.148961] end_request: I/O error, dev dm-17, sector 1
[ 9707.148966] end_request: I/O error, dev dm-5, sector 5
[ 9707.148968] end_request: I/O error, dev dm-17, sector 2
[ 9707.148973] end_request: I/O error, dev dm-8, sector 6
[ 9707.148975] end_request: I/O error, dev dm-17, sector 3
[ 9707.148978] end_request: I/O error, dev dm-5, sector 6
[ 9707.148980] end_request: I/O error, dev dm-8, sector 7
[ 9707.148988] end_request: I/O error, dev dm-5, sector 7
[ 9707.148992] end_request: I/O error, dev dm-17, sector 4
[ 9707.148995] end_request: I/O error, dev dm-8, sector 16
[ 9707.149002] end_request: I/O error, dev dm-5, sector 0
[ 9707.149005] end_request: I/O error, dev dm-17, sector 5
[ 9707.149007] end_request: I/O error, dev dm-8, sector 17
[ 9707.149012] end_request: I/O error, dev dm-17, sector 6
[ 9707.149016] end_request: I/O error, dev dm-8, sector 18
[ 9707.149020] end_request: I/O error, dev dm-17, sector 7
[ 9707.149023] end_request: I/O error, dev dm-5, sector 1
[ 9707.149029] end_request: I/O error, dev dm-17, sector 8
[ 9707.149030] end_request: I/O error, dev dm-8, sector 19
[ 9707.149038] end_request: I/O error, dev dm-17, sector 9
[ 9707.149040] end_request: I/O error, dev dm-8, sector 20
[ 9707.149046] end_request: I/O error, dev dm-17, sector 10
[ 9707.149047] end_request: I/O error, dev dm-8, sector 21
[ 9707.149053] end_request: I/O error, dev dm-17, sector 11
[ 9707.149058] end_request: I/O error, dev dm-8, sector 22
[ 9707.149060] end_request: I/O error, dev dm-17, sector 12
[ 9707.149065] end_request: I/O error, dev dm-5, sector 2
[ 9707.149067] end_request: I/O error, dev dm-17, sector 13
[ 9707.149072] end_request: I/O error, dev dm-5, sector 3
[ 9707.149075] end_request: I/O error, dev dm-17, sector 14
[ 9707.149079] end_request: I/O error, dev dm-5, sector 4
[ 9707.149082] end_request: I/O error, dev dm-17, sector 15
[ 9707.149086] end_request: I/O error, dev dm-5, sector 5
[ 9707.149088] end_request: I/O error, dev dm-8, sector 23
[ 9707.149094] end_request: I/O error, dev dm-5, sector 6
[ 9707.149098] end_request: I/O error, dev dm-17, sector 128
[ 9707.149099] end_request: I/O error, dev dm-8, sector 0
[ 9707.149101] end_request: I/O error, dev dm-5, sector 7
[ 9707.149109] end_request: I/O error, dev dm-8, sector 1
[ 9707.149111] end_request: I/O error, dev dm-17, sector 129
[ 9707.149113] end_request: I/O error, dev dm-5, sector 0
[ 9707.149118] end_request: I/O error, dev dm-17, sector 130
[ 9707.149122] end_request: I/O error, dev dm-8, sector 2
[ 9707.149124] end_request: I/O error, dev dm-5, sector 1
[ 9707.149128] end_request: I/O error, dev dm-17, sector 131
[ 9707.149131] end_request: I/O error, dev dm-5, sector 2
[ 9707.149135] end_request: I/O error, dev dm-17, sector 132
[ 9707.149138] end_request: I/O error, dev dm-5, sector 3
[ 9707.149143] end_request: I/O error, dev dm-17, sector 133
[ 9707.149145] end_request: I/O error, dev dm-5, sector 4
[ 9707.149150] end_request: I/O error, dev dm-17, sector 134
[ 9707.149153] end_request: I/O error, dev dm-5, sector 5
[ 9707.149158] end_request: I/O error, dev dm-17, sector 135
[ 9707.149162] end_request: I/O error, dev dm-5, sector 6
[ 9707.149164] end_request: I/O error, dev dm-8, sector 3
[ 9707.149171] end_request: I/O error, dev dm-5, sector 7
[ 9707.149175] end_request: I/O error, dev dm-17, sector 0
[ 9707.149176] end_request: I/O error, dev dm-8, sector 4
[ 9707.149181] end_request: I/O error, dev dm-17, sector 1
[ 9707.149184] end_request: I/O error, dev dm-8, sector 5
[ 9707.149187] end_request: I/O error, dev dm-5, sector 0
[ 9707.149192] end_request: I/O error, dev dm-17, sector 2
[ 9707.149196] end_request: I/O error, dev dm-8, sector 6
[ 9707.149202] end_request: I/O error, dev dm-5, sector 1
[ 9707.149204] end_request: I/O error, dev dm-17, sector 3
[ 9707.149208] end_request: I/O error, dev dm-5, sector 2
[ 9707.149211] end_request: I/O error, dev dm-17, sector 4
[ 9707.149215] end_request: I/O error, dev dm-5, sector 3
[ 9707.149218] end_request: I/O error, dev dm-17, sector 5
[ 9707.149226] end_request: I/O error, dev dm-17, sector 6
[ 9707.149228] end_request: I/O error, dev dm-5, sector 4
[ 9707.149233] end_request: I/O error, dev dm-17, sector 7
[ 9707.149236] end_request: I/O error, dev dm-5, sector 5
[ 9707.149240] end_request: I/O error, dev dm-8, sector 7
[ 9707.149245] end_request: I/O error, dev dm-17, sector 0
[ 9707.149247] end_request: I/O error, dev dm-5, sector 6
[ 9707.149256] end_request: I/O error, dev dm-17, sector 1
[ 9707.149258] end_request: I/O error, dev dm-5, sector 7
[ 9707.149262] end_request: I/O error, dev dm-8, sector 0
[ 9707.149270] end_request: I/O error, dev dm-17, sector 2
[ 9707.149271] end_request: I/O error, dev dm-5, sector 128
[ 9707.149277] end_request: I/O error, dev dm-8, sector 1
[ 9707.149279] end_request: I/O error, dev dm-5, sector 129
[ 9707.149282] end_request: I/O error, dev dm-17, sector 3
[ 9707.149285] end_request: I/O error, dev dm-5, sector 130
[ 9707.149289] end_request: I/O error, dev dm-17, sector 4
[ 9707.149293] end_request: I/O error, dev dm-5, sector 131
[ 9707.149296] end_request: I/O error, dev dm-8, sector 2
[ 9707.149299] end_request: I/O error, dev dm-17, sector 5
[ 9707.149306] end_request: I/O error, dev dm-8, sector 3
[ 9707.149308] end_request: I/O error, dev dm-17, sector 6
[ 9707.149310] end_request: I/O error, dev dm-5, sector 132
[ 9707.149315] end_request: I/O error, dev dm-17, sector 7
[ 9707.149318] end_request: I/O error, dev dm-5, sector 133
[ 9707.149324] end_request: I/O error, dev dm-5, sector 134
[ 9707.149326] end_request: I/O error, dev dm-17, sector 4096
[ 9707.149329] end_request: I/O error, dev dm-8, sector 4
[ 9707.149334] end_request: I/O error, dev dm-17, sector 4097
[ 9707.149336] end_request: I/O error, dev dm-8, sector 5
[ 9707.149341] end_request: I/O error, dev dm-17, sector 4098
[ 9707.149343] end_request: I/O error, dev dm-8, sector 6
[ 9707.149349] end_request: I/O error, dev dm-17, sector 4099
[ 9707.149356] end_request: I/O error, dev dm-5, sector 135
[ 9707.149358] end_request: I/O error, dev dm-8, sector 7
[ 9707.149362] end_request: I/O error, dev dm-17, sector 4100
[ 9707.149369] end_request: I/O error, dev dm-17, sector 4101
[ 9707.149374] end_request: I/O error, dev dm-8, sector 0
[ 9707.149376] end_request: I/O error, dev dm-5, sector 128
[ 9707.149379] end_request: I/O error, dev dm-17, sector 4102
[ 9707.149382] end_request: I/O error, dev dm-5, sector 129
[ 9707.149384] end_request: I/O error, dev dm-8, sector 1
[ 9707.149390] end_request: I/O error, dev dm-17, sector 4103
[ 9707.149392] end_request: I/O error, dev dm-8, sector 2
[ 9707.149396] end_request: I/O error, dev dm-5, sector 130
[ 9707.149403] end_request: I/O error, dev dm-8, sector 3
[ 9707.149404] end_request: I/O error, dev dm-5, sector 131
[ 9707.149410] end_request: I/O error, dev dm-5, sector 132
[ 9707.149416] end_request: I/O error, dev dm-5, sector 133
[ 9707.149423] end_request: I/O error, dev dm-5, sector 134
[ 9707.149431] end_request: I/O error, dev dm-5, sector 135
[ 9707.149433] end_request: I/O error, dev dm-8, sector 4
[ 9707.149440] end_request: I/O error, dev dm-8, sector 5
[ 9707.149445] end_request: I/O error, dev dm-5, sector 16
[ 9707.149452] end_request: I/O error, dev dm-5, sector 17
[ 9707.149458] end_request: I/O error, dev dm-5, sector 18
[ 9707.149466] end_request: I/O error, dev dm-5, sector 19
[ 9707.149467] end_request: I/O error, dev dm-8, sector 6
[ 9707.149475] end_request: I/O error, dev dm-8, sector 7
[ 9707.149476] end_request: I/O error, dev dm-5, sector 20
[ 9707.149483] end_request: I/O error, dev dm-5, sector 21
[ 9707.149490] end_request: I/O error, dev dm-5, sector 22
[ 9707.149493] end_request: I/O error, dev dm-8, sector 0
[ 9707.149498] end_request: I/O error, dev dm-5, sector 23
[ 9707.149503] end_request: I/O error, dev dm-8, sector 1
[ 9707.149514] end_request: I/O error, dev dm-5, sector 0
[ 9707.149515] end_request: I/O error, dev dm-8, sector 2
[ 9707.149522] end_request: I/O error, dev dm-8, sector 3
[ 9707.149528] end_request: I/O error, dev dm-8, sector 4
[ 9707.149534] end_request: I/O error, dev dm-5, sector 1
[ 9707.149539] end_request: I/O error, dev dm-8, sector 5
[ 9707.149544] end_request: I/O error, dev dm-5, sector 2
[ 9707.149546] end_request: I/O error, dev dm-8, sector 6
[ 9707.149550] end_request: I/O error, dev dm-5, sector 3
[ 9707.149552] end_request: I/O error, dev dm-8, sector 7
[ 9707.149557] end_request: I/O error, dev dm-5, sector 4
[ 9707.149560] end_request: I/O error, dev dm-8, sector 0
[ 9707.149563] end_request: I/O error, dev dm-5, sector 5
[ 9707.149567] end_request: I/O error, dev dm-8, sector 1
[ 9707.149569] end_request: I/O error, dev dm-5, sector 6
[ 9707.149573] end_request: I/O error, dev dm-8, sector 2
[ 9707.149575] end_request: I/O error, dev dm-5, sector 7
[ 9707.149580] end_request: I/O error, dev dm-8, sector 3
[ 9707.149583] end_request: I/O error, dev dm-5, sector 0
[ 9707.149586] end_request: I/O error, dev dm-8, sector 4
[ 9707.149590] end_request: I/O error, dev dm-5, sector 1
[ 9707.149592] end_request: I/O error, dev dm-8, sector 5
[ 9707.149596] end_request: I/O error, dev dm-5, sector 2
[ 9707.149599] end_request: I/O error, dev dm-8, sector 6
[ 9707.149606] end_request: I/O error, dev dm-5, sector 3
[ 9707.149608] end_request: I/O error, dev dm-8, sector 7
[ 9707.149613] end_request: I/O error, dev dm-5, sector 4
[ 9707.149625] end_request: I/O error, dev dm-5, sector 5
[ 9707.149627] end_request: I/O error, dev dm-8, sector 0
[ 9707.149636] end_request: I/O error, dev dm-8, sector 1
[ 9707.149642] end_request: I/O error, dev dm-5, sector 6
[ 9707.149651] end_request: I/O error, dev dm-5, sector 7
[ 9707.149660] end_request: I/O error, dev dm-8, sector 2
[ 9707.149673] end_request: I/O error, dev dm-5, sector 8
[ 9707.149675] end_request: I/O error, dev dm-8, sector 3
[ 9707.149682] end_request: I/O error, dev dm-8, sector 4
[ 9707.149689] end_request: I/O error, dev dm-5, sector 9
[ 9707.149697] end_request: I/O error, dev dm-8, sector 5
[ 9707.149706] end_request: I/O error, dev dm-5, sector 10
[ 9707.149709] end_request: I/O error, dev dm-5, sector 11
[ 9707.149712] end_request: I/O error, dev dm-5, sector 12
[ 9707.149714] end_request: I/O error, dev dm-5, sector 13
[ 9707.149717] end_request: I/O error, dev dm-5, sector 14
[ 9707.149719] end_request: I/O error, dev dm-5, sector 15
[ 9707.149735] end_request: I/O error, dev dm-8, sector 6
[ 9707.149738] end_request: I/O error, dev dm-5, sector 16
[ 9707.149749] end_request: I/O error, dev dm-5, sector 17
[ 9707.149752] end_request: I/O error, dev dm-8, sector 7
[ 9707.149763] end_request: I/O error, dev dm-5, sector 18
[ 9707.149776] end_request: I/O error, dev dm-8, sector 0
[ 9707.149778] end_request: I/O error, dev dm-5, sector 19
[ 9707.149784] end_request: I/O error, dev dm-5, sector 20
[ 9707.149790] end_request: I/O error, dev dm-5, sector 21
[ 9707.149797] end_request: I/O error, dev dm-5, sector 22
[ 9707.149805] end_request: I/O error, dev dm-5, sector 23
[ 9707.149810] end_request: I/O error, dev dm-8, sector 1
[ 9707.149819] end_request: I/O error, dev dm-8, sector 2
[ 9707.149821] end_request: I/O error, dev dm-5, sector 0
[ 9707.149833] end_request: I/O error, dev dm-5, sector 1
[ 9707.149834] end_request: I/O error, dev dm-8, sector 3
[ 9707.149842] end_request: I/O error, dev dm-8, sector 4
[ 9707.149844] end_request: I/O error, dev dm-5, sector 2
[ 9707.149850] end_request: I/O error, dev dm-5, sector 3
[ 9707.149852] end_request: I/O error, dev dm-8, sector 5
[ 9707.149859] end_request: I/O error, dev dm-5, sector 4
[ 9707.149862] end_request: I/O error, dev dm-8, sector 6
[ 9707.149871] end_request: I/O error, dev dm-5, sector 5
[ 9707.149873] end_request: I/O error, dev dm-8, sector 7
[ 9707.149879] end_request: I/O error, dev dm-5, sector 6
[ 9707.149891] end_request: I/O error, dev dm-5, sector 7
[ 9707.149892] end_request: I/O error, dev dm-8, sector 0
[ 9707.149903] end_request: I/O error, dev dm-8, sector 1
[ 9707.149906] end_request: I/O error, dev dm-5, sector 0
[ 9707.149910] end_request: I/O error, dev dm-8, sector 2
[ 9707.149914] end_request: I/O error, dev dm-5, sector 1
[ 9707.149918] end_request: I/O error, dev dm-8, sector 3
[ 9707.149921] end_request: I/O error, dev dm-5, sector 2
[ 9707.149929] end_request: I/O error, dev dm-5, sector 3
[ 9707.149940] end_request: I/O error, dev dm-5, sector 4
[ 9707.149942] end_request: I/O error, dev dm-8, sector 4
[ 9707.149952] end_request: I/O error, dev dm-8, sector 5
[ 9707.149954] end_request: I/O error, dev dm-5, sector 5
[ 9707.149960] end_request: I/O error, dev dm-5, sector 6
[ 9707.149967] end_request: I/O error, dev dm-5, sector 7
[ 9707.149977] end_request: I/O error, dev dm-5, sector 0
[ 9707.149983] end_request: I/O error, dev dm-5, sector 1
[ 9707.149992] end_request: I/O error, dev dm-5, sector 2
[ 9707.149998] end_request: I/O error, dev dm-8, sector 6
[ 9707.150006] end_request: I/O error, dev dm-5, sector 3
[ 9707.150012] end_request: I/O error, dev dm-5, sector 4
[ 9707.150019] end_request: I/O error, dev dm-5, sector 5
[ 9707.150023] end_request: I/O error, dev dm-8, sector 7
[ 9707.150030] end_request: I/O error, dev dm-5, sector 6
[ 9707.150031] end_request: I/O error, dev dm-8, sector 0
[ 9707.150038] end_request: I/O error, dev dm-8, sector 1
[ 9707.150039] end_request: I/O error, dev dm-5, sector 7
[ 9707.150045] end_request: I/O error, dev dm-8, sector 2
[ 9707.150048] end_request: I/O error, dev dm-5, sector 0
[ 9707.150051] end_request: I/O error, dev dm-8, sector 3
[ 9707.150054] end_request: I/O error, dev dm-5, sector 1
[ 9707.150058] end_request: I/O error, dev dm-8, sector 4
[ 9707.150061] end_request: I/O error, dev dm-5, sector 2
[ 9707.150064] end_request: I/O error, dev dm-8, sector 5
[ 9707.150067] end_request: I/O error, dev dm-5, sector 3
[ 9707.150070] end_request: I/O error, dev dm-8, sector 6
[ 9707.150073] end_request: I/O error, dev dm-5, sector 4
[ 9707.150076] end_request: I/O error, dev dm-8, sector 7
[ 9707.150079] end_request: I/O error, dev dm-5, sector 5
[ 9707.150085] end_request: I/O error, dev dm-8, sector 0
[ 9707.150086] end_request: I/O error, dev dm-5, sector 6
[ 9707.150091] end_request: I/O error, dev dm-5, sector 7
[ 9707.150093] end_request: I/O error, dev dm-8, sector 1
[ 9707.150103] end_request: I/O error, dev dm-5, sector 0
[ 9707.150109] end_request: I/O error, dev dm-8, sector 2
[ 9707.150111] end_request: I/O error, dev dm-5, sector 1
[ 9707.150118] end_request: I/O error, dev dm-5, sector 2
[ 9707.150128] end_request: I/O error, dev dm-5, sector 3
[ 9707.150130] end_request: I/O error, dev dm-8, sector 3
[ 9707.150141] end_request: I/O error, dev dm-5, sector 4
[ 9707.150143] end_request: I/O error, dev dm-8, sector 4
[ 9707.150149] end_request: I/O error, dev dm-8, sector 5
[ 9707.150152] end_request: I/O error, dev dm-5, sector 5
[ 9707.150158] end_request: I/O error, dev dm-5, sector 6
[ 9707.150164] end_request: I/O error, dev dm-5, sector 7
[ 9707.150173] end_request: I/O error, dev dm-8, sector 6
[ 9707.150176] end_request: I/O error, dev dm-5, sector 0
[ 9707.150182] end_request: I/O error, dev dm-5, sector 1
[ 9707.150189] end_request: I/O error, dev dm-8, sector 7
[ 9707.150191] end_request: I/O error, dev dm-5, sector 2
[ 9707.150198] end_request: I/O error, dev dm-5, sector 3
[ 9707.150204] end_request: I/O error, dev dm-5, sector 4
[ 9707.150210] end_request: I/O error, dev dm-5, sector 5
[ 9707.150217] end_request: I/O error, dev dm-5, sector 6
[ 9707.150223] end_request: I/O error, dev dm-5, sector 7
[ 9707.150226] end_request: I/O error, dev dm-8, sector 0
[ 9707.150233] end_request: I/O error, dev dm-5, sector 8
[ 9707.150239] end_request: I/O error, dev dm-8, sector 1
[ 9707.150244] end_request: I/O error, dev dm-5, sector 9
[ 9707.150250] end_request: I/O error, dev dm-5, sector 10
[ 9707.150257] end_request: I/O error, dev dm-5, sector 11
[ 9707.150262] end_request: I/O error, dev dm-8, sector 2
[ 9707.150264] end_request: I/O error, dev dm-5, sector 12
[ 9707.150270] end_request: I/O error, dev dm-5, sector 13
[ 9707.150276] end_request: I/O error, dev dm-5, sector 14
[ 9707.150285] end_request: I/O error, dev dm-5, sector 15
[ 9707.150294] end_request: I/O error, dev dm-5, sector 128
[ 9707.150300] end_request: I/O error, dev dm-5, sector 129
[ 9707.150307] end_request: I/O error, dev dm-5, sector 130
[ 9707.150309] end_request: I/O error, dev dm-8, sector 3
[ 9707.150319] end_request: I/O error, dev dm-8, sector 4
[ 9707.150326] end_request: I/O error, dev dm-5, sector 131
[ 9707.150331] end_request: I/O error, dev dm-8, sector 5
[ 9707.150335] end_request: I/O error, dev dm-5, sector 132
[ 9707.150341] end_request: I/O error, dev dm-5, sector 133
[ 9707.150347] end_request: I/O error, dev dm-5, sector 134
[ 9707.150356] end_request: I/O error, dev dm-5, sector 135
[ 9707.150358] end_request: I/O error, dev dm-8, sector 6
[ 9707.150365] end_request: I/O error, dev dm-5, sector 0
[ 9707.150368] end_request: I/O error, dev dm-8, sector 7
[ 9707.150378] end_request: I/O error, dev dm-5, sector 1
[ 9707.150381] end_request: I/O error, dev dm-8, sector 0
[ 9707.150390] end_request: I/O error, dev dm-5, sector 2
[ 9707.150392] end_request: I/O error, dev dm-8, sector 1
[ 9707.150398] end_request: I/O error, dev dm-8, sector 2
[ 9707.150400] end_request: I/O error, dev dm-5, sector 3
[ 9707.150407] end_request: I/O error, dev dm-5, sector 4
[ 9707.150418] end_request: I/O error, dev dm-5, sector 5
[ 9707.150421] end_request: I/O error, dev dm-8, sector 3
[ 9707.150430] end_request: I/O error, dev dm-5, sector 6
[ 9707.150431] end_request: I/O error, dev dm-8, sector 4
[ 9707.150437] end_request: I/O error, dev dm-8, sector 5
[ 9707.150438] end_request: I/O error, dev dm-5, sector 7
[ 9707.150447] end_request: I/O error, dev dm-5, sector 0
[ 9707.150453] end_request: I/O error, dev dm-8, sector 6
[ 9707.150455] end_request: I/O error, dev dm-5, sector 1
[ 9707.150461] end_request: I/O error, dev dm-5, sector 2
[ 9707.150465] end_request: I/O error, dev dm-8, sector 7
[ 9707.150467] end_request: I/O error, dev dm-5, sector 3
[ 9707.150473] end_request: I/O error, dev dm-5, sector 4
[ 9707.150474] end_request: I/O error, dev dm-8, sector 128
[ 9707.150480] end_request: I/O error, dev dm-5, sector 5
[ 9707.150482] end_request: I/O error, dev dm-8, sector 129
[ 9707.150486] end_request: I/O error, dev dm-5, sector 6
[ 9707.150488] end_request: I/O error, dev dm-8, sector 130
[ 9707.150492] end_request: I/O error, dev dm-5, sector 7
[ 9707.150495] end_request: I/O error, dev dm-8, sector 131
[ 9707.150501] end_request: I/O error, dev dm-8, sector 132
[ 9707.150506] end_request: I/O error, dev dm-5, sector 4096
[ 9707.150507] end_request: I/O error, dev dm-8, sector 133
[ 9707.150513] end_request: I/O error, dev dm-8, sector 134
[ 9707.150515] end_request: I/O error, dev dm-5, sector 4097
[ 9707.150520] end_request: I/O error, dev dm-5, sector 4098
[ 9707.150522] end_request: I/O error, dev dm-8, sector 135
[ 9707.150528] end_request: I/O error, dev dm-5, sector 4099
[ 9707.150530] end_request: I/O error, dev dm-8, sector 128
[ 9707.150537] end_request: I/O error, dev dm-5, sector 4100
[ 9707.150546] end_request: I/O error, dev dm-8, sector 129
[ 9707.150548] end_request: I/O error, dev dm-5, sector 4101
[ 9707.150555] end_request: I/O error, dev dm-5, sector 4102
[ 9707.150563] end_request: I/O error, dev dm-5, sector 4103
[ 9707.150565] end_request: I/O error, dev dm-8, sector 130
[ 9707.150590] end_request: I/O error, dev dm-8, sector 131
[ 9707.150613] end_request: I/O error, dev dm-8, sector 132
[ 9707.150658] end_request: I/O error, dev dm-8, sector 133
[ 9707.150665] end_request: I/O error, dev dm-8, sector 134
[ 9707.150685] end_request: I/O error, dev dm-8, sector 135
[ 9707.150696] end_request: I/O error, dev dm-8, sector 16
[ 9707.150708] end_request: I/O error, dev dm-8, sector 17
[ 9707.150715] end_request: I/O error, dev dm-8, sector 18
[ 9707.150728] end_request: I/O error, dev dm-8, sector 19
[ 9707.150746] end_request: I/O error, dev dm-8, sector 20
[ 9707.150758] end_request: I/O error, dev dm-8, sector 21
[ 9707.150771] end_request: I/O error, dev dm-8, sector 22
[ 9707.150790] end_request: I/O error, dev dm-8, sector 23
[ 9707.150802] end_request: I/O error, dev dm-8, sector 0
[ 9707.150809] end_request: I/O error, dev dm-8, sector 1
[ 9707.150815] end_request: I/O error, dev dm-8, sector 2
[ 9707.150821] end_request: I/O error, dev dm-8, sector 3
[ 9707.150827] end_request: I/O error, dev dm-8, sector 4
[ 9707.150833] end_request: I/O error, dev dm-8, sector 5
[ 9707.150839] end_request: I/O error, dev dm-8, sector 6
[ 9707.150845] end_request: I/O error, dev dm-8, sector 7
[ 9707.150853] end_request: I/O error, dev dm-8, sector 0
[ 9707.150860] end_request: I/O error, dev dm-8, sector 1
[ 9707.150872] end_request: I/O error, dev dm-8, sector 2
[ 9707.150879] end_request: I/O error, dev dm-8, sector 3
[ 9707.150896] end_request: I/O error, dev dm-8, sector 4
[ 9707.150906] end_request: I/O error, dev dm-8, sector 5
[ 9707.150926] end_request: I/O error, dev dm-8, sector 6
[ 9707.150936] end_request: I/O error, dev dm-8, sector 7
[ 9707.150946] end_request: I/O error, dev dm-8, sector 8
[ 9707.150954] end_request: I/O error, dev dm-8, sector 9
[ 9707.150971] end_request: I/O error, dev dm-8, sector 10
[ 9707.150999] end_request: I/O error, dev dm-8, sector 11
[ 9707.151016] end_request: I/O error, dev dm-8, sector 12
[ 9707.151033] end_request: I/O error, dev dm-8, sector 13
[ 9707.151045] end_request: I/O error, dev dm-8, sector 14
[ 9707.151057] end_request: I/O error, dev dm-8, sector 15
[ 9707.151078] end_request: I/O error, dev dm-8, sector 16
[ 9707.151085] end_request: I/O error, dev dm-8, sector 17
[ 9707.151092] end_request: I/O error, dev dm-8, sector 18
[ 9707.151098] end_request: I/O error, dev dm-8, sector 19
[ 9707.151104] end_request: I/O error, dev dm-8, sector 20
[ 9707.151110] end_request: I/O error, dev dm-8, sector 21
[ 9707.151116] end_request: I/O error, dev dm-8, sector 22
[ 9707.151122] end_request: I/O error, dev dm-8, sector 23
[ 9707.151130] end_request: I/O error, dev dm-8, sector 0
[ 9707.151136] end_request: I/O error, dev dm-8, sector 1
[ 9707.151142] end_request: I/O error, dev dm-8, sector 2
[ 9707.151148] end_request: I/O error, dev dm-8, sector 3
[ 9707.151154] end_request: I/O error, dev dm-8, sector 4
[ 9707.151169] end_request: I/O error, dev dm-8, sector 5
[ 9707.151183] end_request: I/O error, dev dm-8, sector 6
[ 9707.151196] end_request: I/O error, dev dm-8, sector 7
[ 9707.151207] end_request: I/O error, dev dm-8, sector 0
[ 9707.151216] end_request: I/O error, dev dm-8, sector 1
[ 9707.151232] end_request: I/O error, dev dm-8, sector 2
[ 9707.151242] end_request: I/O error, dev dm-8, sector 3
[ 9707.151273] end_request: I/O error, dev dm-8, sector 4
[ 9707.151302] end_request: I/O error, dev dm-8, sector 5
[ 9707.151316] end_request: I/O error, dev dm-8, sector 6
[ 9707.151343] end_request: I/O error, dev dm-8, sector 7
[ 9707.151355] end_request: I/O error, dev dm-8, sector 0
[ 9707.151368] end_request: I/O error, dev dm-8, sector 1
[ 9707.151396] end_request: I/O error, dev dm-8, sector 2
[ 9707.151408] end_request: I/O error, dev dm-8, sector 3
[ 9707.151414] end_request: I/O error, dev dm-8, sector 4
[ 9707.151420] end_request: I/O error, dev dm-8, sector 5
[ 9707.151426] end_request: I/O error, dev dm-8, sector 6
[ 9707.151432] end_request: I/O error, dev dm-8, sector 7
[ 9707.151440] end_request: I/O error, dev dm-8, sector 0
[ 9707.151446] end_request: I/O error, dev dm-8, sector 1
[ 9707.151452] end_request: I/O error, dev dm-8, sector 2
[ 9707.151458] end_request: I/O error, dev dm-8, sector 3
[ 9707.151464] end_request: I/O error, dev dm-8, sector 4
[ 9707.151470] end_request: I/O error, dev dm-8, sector 5
[ 9707.151478] end_request: I/O error, dev dm-8, sector 6
[ 9707.151486] end_request: I/O error, dev dm-8, sector 7
[ 9707.151495] end_request: I/O error, dev dm-8, sector 0
[ 9707.151520] end_request: I/O error, dev dm-8, sector 1
[ 9707.151543] end_request: I/O error, dev dm-8, sector 2
[ 9707.151551] end_request: I/O error, dev dm-8, sector 3
[ 9707.151569] end_request: I/O error, dev dm-8, sector 4
[ 9707.151582] end_request: I/O error, dev dm-8, sector 5
[ 9707.151591] end_request: I/O error, dev dm-8, sector 6
[ 9707.151606] end_request: I/O error, dev dm-8, sector 7
[ 9707.151622] end_request: I/O error, dev dm-8, sector 0
[ 9707.151632] end_request: I/O error, dev dm-8, sector 1
[ 9707.151654] end_request: I/O error, dev dm-8, sector 2
[ 9707.151669] end_request: I/O error, dev dm-8, sector 3
[ 9707.151681] end_request: I/O error, dev dm-8, sector 4
[ 9707.151695] end_request: I/O error, dev dm-8, sector 5
[ 9707.151701] end_request: I/O error, dev dm-8, sector 6
[ 9707.151707] end_request: I/O error, dev dm-8, sector 7
[ 9707.151716] end_request: I/O error, dev dm-8, sector 8
[ 9707.151722] end_request: I/O error, dev dm-8, sector 9
[ 9707.151728] end_request: I/O error, dev dm-8, sector 10
[ 9707.151734] end_request: I/O error, dev dm-8, sector 11
[ 9707.151740] end_request: I/O error, dev dm-8, sector 12
[ 9707.151746] end_request: I/O error, dev dm-8, sector 13
[ 9707.151752] end_request: I/O error, dev dm-8, sector 14
[ 9707.151758] end_request: I/O error, dev dm-8, sector 15
[ 9707.151766] end_request: I/O error, dev dm-8, sector 128
[ 9707.151790] end_request: I/O error, dev dm-8, sector 129
[ 9707.151805] end_request: I/O error, dev dm-8, sector 130
[ 9707.151815] end_request: I/O error, dev dm-8, sector 131
[ 9707.151837] end_request: I/O error, dev dm-8, sector 132
[ 9707.151846] end_request: I/O error, dev dm-8, sector 133
[ 9707.151872] end_request: I/O error, dev dm-8, sector 134
[ 9707.151881] end_request: I/O error, dev dm-8, sector 135
[ 9707.151902] end_request: I/O error, dev dm-8, sector 0
[ 9707.151925] end_request: I/O error, dev dm-8, sector 1
[ 9707.151946] end_request: I/O error, dev dm-8, sector 2
[ 9707.151959] end_request: I/O error, dev dm-8, sector 3
[ 9707.151968] end_request: I/O error, dev dm-8, sector 4
[ 9707.151977] end_request: I/O error, dev dm-8, sector 5
[ 9707.151989] end_request: I/O error, dev dm-8, sector 6
[ 9707.151998] end_request: I/O error, dev dm-8, sector 7
[ 9707.152006] end_request: I/O error, dev dm-8, sector 0
[ 9707.152013] end_request: I/O error, dev dm-8, sector 1
[ 9707.152019] end_request: I/O error, dev dm-8, sector 2
[ 9707.152025] end_request: I/O error, dev dm-8, sector 3
[ 9707.152030] end_request: I/O error, dev dm-8, sector 4
[ 9707.152036] end_request: I/O error, dev dm-8, sector 5
[ 9707.152042] end_request: I/O error, dev dm-8, sector 6
[ 9707.152048] end_request: I/O error, dev dm-8, sector 7
[ 9707.152058] end_request: I/O error, dev dm-8, sector 4096
[ 9707.152064] end_request: I/O error, dev dm-8, sector 4097
[ 9707.152073] end_request: I/O error, dev dm-8, sector 4098
[ 9707.152081] end_request: I/O error, dev dm-8, sector 4099
[ 9707.152091] end_request: I/O error, dev dm-8, sector 4100
[ 9707.152106] end_request: I/O error, dev dm-8, sector 4101
[ 9707.152115] end_request: I/O error, dev dm-8, sector 4102
[ 9707.152133] end_request: I/O error, dev dm-8, sector 4103
[ 9707.184511] end_request: I/O error, dev dm-4, sector 0
[ 9707.184523] end_request: I/O error, dev dm-4, sector 1
[ 9707.184530] end_request: I/O error, dev dm-4, sector 2
[ 9707.184537] end_request: I/O error, dev dm-4, sector 3
[ 9707.184543] end_request: I/O error, dev dm-4, sector 4
[ 9707.184549] end_request: I/O error, dev dm-4, sector 5
[ 9707.184555] end_request: I/O error, dev dm-4, sector 6
[ 9707.184561] end_request: I/O error, dev dm-4, sector 7
[ 9707.184576] end_request: I/O error, dev dm-4, sector 93674283000
[ 9707.184586] end_request: I/O error, dev dm-4, sector 93674283001
[ 9707.184599] end_request: I/O error, dev dm-4, sector 93674283002
[ 9707.184605] end_request: I/O error, dev dm-4, sector 93674283003
[ 9707.184612] end_request: I/O error, dev dm-4, sector 93674283004
[ 9707.184618] end_request: I/O error, dev dm-4, sector 93674283005
[ 9707.184624] end_request: I/O error, dev dm-4, sector 93674283006
[ 9707.184630] end_request: I/O error, dev dm-4, sector 93674283007
[ 9707.184639] end_request: I/O error, dev dm-4, sector 0
[ 9707.184645] end_request: I/O error, dev dm-4, sector 1
[ 9707.184651] end_request: I/O error, dev dm-4, sector 2
[ 9707.184657] end_request: I/O error, dev dm-4, sector 3
[ 9707.184663] end_request: I/O error, dev dm-4, sector 4
[ 9707.184669] end_request: I/O error, dev dm-4, sector 5
[ 9707.184675] end_request: I/O error, dev dm-4, sector 6
[ 9707.184681] end_request: I/O error, dev dm-4, sector 7
[ 9707.184697] end_request: I/O error, dev dm-4, sector 0
[ 9707.184704] end_request: I/O error, dev dm-4, sector 1
[ 9707.184710] end_request: I/O error, dev dm-4, sector 2
[ 9707.184716] end_request: I/O error, dev dm-4, sector 3
[ 9707.184722] end_request: I/O error, dev dm-4, sector 4
[ 9707.184728] end_request: I/O error, dev dm-4, sector 5
[ 9707.184734] end_request: I/O error, dev dm-4, sector 6
[ 9707.184739] end_request: I/O error, dev dm-4, sector 7
[ 9707.184749] end_request: I/O error, dev dm-4, sector 24
[ 9707.184755] end_request: I/O error, dev dm-4, sector 25
[ 9707.184761] end_request: I/O error, dev dm-4, sector 26
[ 9707.184767] end_request: I/O error, dev dm-4, sector 27
[ 9707.184773] end_request: I/O error, dev dm-4, sector 28
[ 9707.184779] end_request: I/O error, dev dm-4, sector 29
[ 9707.184785] end_request: I/O error, dev dm-4, sector 30
[ 9707.184791] end_request: I/O error, dev dm-4, sector 31
[ 9707.209178] end_request: I/O error, dev dm-4, sector 0
[ 9707.209190] end_request: I/O error, dev dm-4, sector 1
[ 9707.209197] end_request: I/O error, dev dm-4, sector 2
[ 9707.209203] end_request: I/O error, dev dm-4, sector 3
[ 9707.209209] end_request: I/O error, dev dm-4, sector 4
[ 9707.209215] end_request: I/O error, dev dm-4, sector 5
[ 9707.209221] end_request: I/O error, dev dm-4, sector 6
[ 9707.209226] end_request: I/O error, dev dm-4, sector 7
[ 9707.209237] end_request: I/O error, dev dm-4, sector 0
[ 9707.209243] end_request: I/O error, dev dm-4, sector 1
[ 9707.209249] end_request: I/O error, dev dm-4, sector 2
[ 9707.209255] end_request: I/O error, dev dm-4, sector 3
[ 9707.209262] end_request: I/O error, dev dm-4, sector 4
[ 9707.209273] end_request: I/O error, dev dm-4, sector 5
[ 9707.209284] end_request: I/O error, dev dm-4, sector 6
[ 9707.209293] end_request: I/O error, dev dm-4, sector 7
[ 9707.209302] end_request: I/O error, dev dm-4, sector 0
[ 9707.209308] end_request: I/O error, dev dm-4, sector 1
[ 9707.209314] end_request: I/O error, dev dm-4, sector 2
[ 9707.209320] end_request: I/O error, dev dm-4, sector 3
[ 9707.209325] end_request: I/O error, dev dm-4, sector 4
[ 9707.209331] end_request: I/O error, dev dm-4, sector 5
[ 9707.209338] end_request: I/O error, dev dm-4, sector 6
[ 9707.209343] end_request: I/O error, dev dm-4, sector 7
[ 9707.209351] end_request: I/O error, dev dm-4, sector 0
[ 9707.209357] end_request: I/O error, dev dm-4, sector 1
[ 9707.209367] end_request: I/O error, dev dm-4, sector 2
[ 9707.209378] end_request: I/O error, dev dm-4, sector 3
[ 9707.209388] end_request: I/O error, dev dm-4, sector 4
[ 9707.209394] end_request: I/O error, dev dm-4, sector 5
[ 9707.209400] end_request: I/O error, dev dm-4, sector 6
[ 9707.209406] end_request: I/O error, dev dm-4, sector 7
[ 9707.209418] end_request: I/O error, dev dm-4, sector 0
[ 9707.209424] end_request: I/O error, dev dm-4, sector 1
[ 9707.209430] end_request: I/O error, dev dm-4, sector 2
[ 9707.209436] end_request: I/O error, dev dm-4, sector 3
[ 9707.209441] end_request: I/O error, dev dm-4, sector 4
[ 9707.209447] end_request: I/O error, dev dm-4, sector 5
[ 9707.209453] end_request: I/O error, dev dm-4, sector 6
[ 9707.209458] end_request: I/O error, dev dm-4, sector 7
[ 9707.209466] end_request: I/O error, dev dm-4, sector 0
[ 9707.209472] end_request: I/O error, dev dm-4, sector 1
[ 9707.209478] end_request: I/O error, dev dm-4, sector 2
[ 9707.209483] end_request: I/O error, dev dm-4, sector 3
[ 9707.209489] end_request: I/O error, dev dm-4, sector 4
[ 9707.209495] end_request: I/O error, dev dm-4, sector 5
[ 9707.209501] end_request: I/O error, dev dm-4, sector 6
[ 9707.209506] end_request: I/O error, dev dm-4, sector 7
[ 9707.209514] end_request: I/O error, dev dm-4, sector 0
[ 9707.209532] end_request: I/O error, dev dm-4, sector 1
[ 9707.209538] end_request: I/O error, dev dm-4, sector 2
[ 9707.209544] end_request: I/O error, dev dm-4, sector 3
[ 9707.209550] end_request: I/O error, dev dm-4, sector 4
[ 9707.209555] end_request: I/O error, dev dm-4, sector 5
[ 9707.209561] end_request: I/O error, dev dm-4, sector 6
[ 9707.209567] end_request: I/O error, dev dm-4, sector 7
[ 9707.209575] end_request: I/O error, dev dm-4, sector 0
[ 9707.209581] end_request: I/O error, dev dm-4, sector 1
[ 9707.209587] end_request: I/O error, dev dm-4, sector 2
[ 9707.209593] end_request: I/O error, dev dm-4, sector 3
[ 9707.209599] end_request: I/O error, dev dm-4, sector 4
[ 9707.209604] end_request: I/O error, dev dm-4, sector 5
[ 9707.209610] end_request: I/O error, dev dm-4, sector 6
[ 9707.209616] end_request: I/O error, dev dm-4, sector 7
[ 9707.209623] end_request: I/O error, dev dm-4, sector 0
[ 9707.209632] end_request: I/O error, dev dm-4, sector 1
[ 9707.209637] end_request: I/O error, dev dm-4, sector 2
[ 9707.209643] end_request: I/O error, dev dm-4, sector 3
[ 9707.209649] end_request: I/O error, dev dm-4, sector 4
[ 9707.209654] end_request: I/O error, dev dm-4, sector 5
[ 9707.209660] end_request: I/O error, dev dm-4, sector 6
[ 9707.209665] end_request: I/O error, dev dm-4, sector 7
[ 9707.209673] end_request: I/O error, dev dm-4, sector 56
[ 9707.209679] end_request: I/O error, dev dm-4, sector 57
[ 9707.209685] end_request: I/O error, dev dm-4, sector 58
[ 9707.209691] end_request: I/O error, dev dm-4, sector 59
[ 9707.209696] end_request: I/O error, dev dm-4, sector 60
[ 9707.209702] end_request: I/O error, dev dm-4, sector 61
[ 9707.209708] end_request: I/O error, dev dm-4, sector 62
[ 9707.209713] end_request: I/O error, dev dm-4, sector 63
[ 9707.209721] end_request: I/O error, dev dm-4, sector 0
[ 9707.209727] end_request: I/O error, dev dm-4, sector 1
...
[ 9707.209820] end_request: I/O error, dev dm-4, sector 93674282880
[ 9707.209826] end_request: I/O error, dev dm-4, sector 93674282881
...
[ 9707.216129] end_request: I/O error, dev dm-4, sector 4096

[snip: ~1,070 more "end_request: I/O error, dev dm-4" lines, all within the
same ~6.5 ms burst. The errors hit the same few 8-sector runs over and over:
sectors 0-7, 8-15, 16-23, 24-31, 56-63, 64-71, 120-135, 256-279, 768-791,
2048-2055 and 4096 near the start of the device, plus 93674282880-887,
93674282944-951, 93674282992-999 and 93674283000-007 at the very top. Each
run repeats anywhere from a handful of times up to several dozen (sectors
0-7).]
[ 9707.216135] end_request: I/O error, dev dm-4, sector 4097
[ 9707.216140] end_request: I/O error, dev dm-4, sector 4098
[ 9707.216146] end_request: I/O error, dev dm-4, sector 4099
[ 9707.216152] end_request: I/O error, dev dm-4, sector 4100
[ 9707.216157] end_request: I/O error, dev dm-4, sector 4101
[ 9707.216163] end_request: I/O error, dev dm-4, sector 4102
[ 9707.216169] end_request: I/O error, dev dm-4, sector 4103
[ 9707.311427] end_request: I/O error, dev dm-12, sector 0
[... 319 similar lines trimmed: between 9707.311 and 9707.392, dm-12, dm-6, dm-8, dm-11, dm-10, dm-5, dm-17 and dm-4 each log an identical burst of I/O errors on sectors 0-7 (three times over), 24-31 and 93674283000-93674283007 ...]
[ 9708.483744] XFS (dm-8): xfs_log_force: error 5 returned.
[ 9722.277632] sd 10:0:0:3: attempting task abort! scmd(ffff883e32b4f0c0)
[ 9722.277635] sd 10:0:0:3: [sdak] CDB: Read(10): 28 00 00 00 00 00 00 00 08 00
[ 9722.277643] scsi target10:0:0: handle(0x0009), sas_address(0x500c0ff1dc3d5600), phy(0)
[ 9722.277645] scsi target10:0:0: enclosure_logical_id(0x500605b008b36010), slot(0)
[ 9722.279380] sd 10:0:0:3: task abort: SUCCESS scmd(ffff883e32b4f0c0)
[... trimmed: the same Read(10) task abort / SUCCESS sequence follows for sd 10:0:0:11 [sdam], 10:0:0:13 [sdao], 10:0:0:21 [sdaq], 10:0:0:23 [sdas], 10:0:0:31 [sdau], 10:0:0:33 [sdaw] and 10:0:0:1 [sdai] ...]
[ 9723.171675] sd 9:0:0:1: attempting task abort! scmd(ffff883e1eeb7c80)
[ 9723.171678] sd 9:0:0:1: [sds] CDB: Read(10): 28 00 00 00 00 00 00 00 08 00
[ 9723.171685] scsi target9:0:0: handle(0x0009), sas_address(0x500c0ff1dc3d5400), phy(0)
[ 9723.171687] scsi target9:0:0: enclosure_logical_id(0x500605b008b36000), slot(0)
[ 9723.173492] sd 9:0:0:1: task abort: SUCCESS scmd(ffff883e1eeb7c80)
[... trimmed: the same sequence follows for sd 9:0:0:3 [sdu], 9:0:0:11 [sdw], 9:0:0:13 [sdy], 9:0:0:21 [sdaa], 9:0:0:23 [sdac], 9:0:0:31 [sdae] and 9:0:0:33 [sdag] ...]
[ 9736.965552] XFS (dm-4): xfs_log_force: error 5 returned.
[ 9736.965555] XFS (dm-6): xfs_log_force: error 5 returned.
[ 9736.965558] XFS (dm-10): xfs_log_force: error 5 returned.
[ 9737.093272] XFS (dm-5): xfs_log_force: error 5 returned.
[ 9737.093276] XFS (dm-12): xfs_log_force: error 5 returned.
[ 9737.093382] XFS (dm-11): xfs_log_force: error 5 returned.
[ 9738.498207] XFS (dm-8): xfs_log_force: error 5 returned.
[ 9766.980016] XFS (dm-4): xfs_log_force: error 5 returned.
[ 9766.980018] XFS (dm-6): xfs_log_force: error 5 returned.
[ 9766.980021] XFS (dm-10): xfs_log_force: error 5 returned.
[ 9767.107734] XFS (dm-12): xfs_log_force: error 5 returned.
[ 9767.107736] XFS (dm-5): xfs_log_force: error 5 returned.
[ 9767.107853] XFS (dm-11): xfs_log_force: error 5 returned.
[ 9768.512673] XFS (dm-8): xfs_log_force: error 5 returned.
[ 9772.093958] igb: diskmgmt2 NIC Link is Down
[ 9772.094856] ctrl_br0: port 2(diskmgmt2) entered disabled state
[... trimmed: the same seven xfs_log_force "error 5" (EIO) messages for dm-4, dm-6, dm-10, dm-5, dm-12, dm-11 and dm-8 repeat every ~30 seconds from 9796.994 through 11898.006 ...]
[11898.006484] XFS (dm-6): xfs_log_force: error 5 returned.
[11898.006486] XFS (dm-10): xfs_log_force: error 5 returned.
[11898.134205] XFS (dm-12): xfs_log_force: error 5 returned.
[11898.134208] XFS (dm-5): xfs_log_force: error 5 returned.
[11898.134275] XFS (dm-11): xfs_log_force: error 5 returned.
[11899.539132] XFS (dm-8): xfs_log_force: error 5 returned.
[11928.020942] XFS (dm-4): xfs_log_force: error 5 returned.
[11928.020945] XFS (dm-6): xfs_log_force: error 5 returned.
[11928.020947] XFS (dm-10): xfs_log_force: error 5 returned.
[11928.148659] XFS (dm-5): xfs_log_force: error 5 returned.
[11928.148661] XFS (dm-11): xfs_log_force: error 5 returned.
[11928.148664] XFS (dm-12): xfs_log_force: error 5 returned.
[11929.553593] XFS (dm-8): xfs_log_force: error 5 returned.
[11958.035392] XFS (dm-4): xfs_log_force: error 5 returned.
[11958.035395] XFS (dm-6): xfs_log_force: error 5 returned.
[11958.035397] XFS (dm-10): xfs_log_force: error 5 returned.
[11958.163116] XFS (dm-12): xfs_log_force: error 5 returned.
[11958.163119] XFS (dm-5): xfs_log_force: error 5 returned.
[11958.163218] XFS (dm-11): xfs_log_force: error 5 returned.
[11959.568048] XFS (dm-8): xfs_log_force: error 5 returned.
[11988.049853] XFS (dm-4): xfs_log_force: error 5 returned.
[11988.049856] XFS (dm-6): xfs_log_force: error 5 returned.
[11988.049858] XFS (dm-10): xfs_log_force: error 5 returned.
[11988.177573] XFS (dm-5): xfs_log_force: error 5 returned.
[11988.177576] XFS (dm-12): xfs_log_force: error 5 returned.
[11988.177579] XFS (dm-11): xfs_log_force: error 5 returned.
[11989.582510] XFS (dm-8): xfs_log_force: error 5 returned.
[12018.064314] XFS (dm-4): xfs_log_force: error 5 returned.
[12018.064317] XFS (dm-6): xfs_log_force: error 5 returned.
[12018.064319] XFS (dm-10): xfs_log_force: error 5 returned.
[12018.192031] XFS (dm-5): xfs_log_force: error 5 returned.
[12018.192034] XFS (dm-12): xfs_log_force: error 5 returned.
[12018.192163] XFS (dm-11): xfs_log_force: error 5 returned.
[12019.596973] XFS (dm-8): xfs_log_force: error 5 returned.
[12048.078769] XFS (dm-4): xfs_log_force: error 5 returned.
[12048.078772] XFS (dm-6): xfs_log_force: error 5 returned.
[12048.078773] XFS (dm-10): xfs_log_force: error 5 returned.
[12048.206493] XFS (dm-12): xfs_log_force: error 5 returned.
[12048.206496] XFS (dm-11): xfs_log_force: error 5 returned.
[12048.206499] XFS (dm-5): xfs_log_force: error 5 returned.
[12049.611419] XFS (dm-8): xfs_log_force: error 5 returned.
[12078.093229] XFS (dm-4): xfs_log_force: error 5 returned.
[12078.093232] XFS (dm-6): xfs_log_force: error 5 returned.
[12078.093234] XFS (dm-10): xfs_log_force: error 5 returned.
[12078.220948] XFS (dm-12): xfs_log_force: error 5 returned.
[12078.220950] XFS (dm-5): xfs_log_force: error 5 returned.
[12078.220961] XFS (dm-11): xfs_log_force: error 5 returned.
[12079.625881] XFS (dm-8): xfs_log_force: error 5 returned.
[12108.107681] XFS (dm-4): xfs_log_force: error 5 returned.
[12108.107684] XFS (dm-6): xfs_log_force: error 5 returned.
[12108.107686] XFS (dm-10): xfs_log_force: error 5 returned.
[12108.235404] XFS (dm-12): xfs_log_force: error 5 returned.
[12108.235407] XFS (dm-5): xfs_log_force: error 5 returned.
[12108.235431] XFS (dm-11): xfs_log_force: error 5 returned.
[12109.640340] XFS (dm-8): xfs_log_force: error 5 returned.
[12138.122142] XFS (dm-4): xfs_log_force: error 5 returned.
[12138.122144] XFS (dm-6): xfs_log_force: error 5 returned.
[12138.122146] XFS (dm-10): xfs_log_force: error 5 returned.
[12138.249863] XFS (dm-5): xfs_log_force: error 5 returned.
[12138.249865] XFS (dm-12): xfs_log_force: error 5 returned.
[12138.249874] XFS (dm-11): xfs_log_force: error 5 returned.
[12139.654807] XFS (dm-8): xfs_log_force: error 5 returned.
[12168.136603] XFS (dm-4): xfs_log_force: error 5 returned.
[12168.136606] XFS (dm-6): xfs_log_force: error 5 returned.
[12168.136608] XFS (dm-10): xfs_log_force: error 5 returned.
[12168.264320] XFS (dm-5): xfs_log_force: error 5 returned.
[12168.264328] XFS (dm-12): xfs_log_force: error 5 returned.
[12168.264346] XFS (dm-11): xfs_log_force: error 5 returned.
[12169.669253] XFS (dm-8): xfs_log_force: error 5 returned.
[12198.151059] XFS (dm-4): xfs_log_force: error 5 returned.
[12198.151062] XFS (dm-6): xfs_log_force: error 5 returned.
[12198.151064] XFS (dm-10): xfs_log_force: error 5 returned.
[12198.278783] XFS (dm-12): xfs_log_force: error 5 returned.
[12198.278786] XFS (dm-11): xfs_log_force: error 5 returned.
[12198.278788] XFS (dm-5): xfs_log_force: error 5 returned.
[12199.683717] XFS (dm-8): xfs_log_force: error 5 returned.
[12228.165522] XFS (dm-4): xfs_log_force: error 5 returned.
[12228.165525] XFS (dm-6): xfs_log_force: error 5 returned.
[12228.165527] XFS (dm-10): xfs_log_force: error 5 returned.
[12228.293240] XFS (dm-12): xfs_log_force: error 5 returned.
[12228.293243] XFS (dm-5): xfs_log_force: error 5 returned.
[12228.293293] XFS (dm-11): xfs_log_force: error 5 returned.
[12229.698172] XFS (dm-8): xfs_log_force: error 5 returned.
[12258.179977] XFS (dm-4): xfs_log_force: error 5 returned.
[12258.179979] XFS (dm-6): xfs_log_force: error 5 returned.
[12258.179981] XFS (dm-10): xfs_log_force: error 5 returned.
[12258.307702] XFS (dm-12): xfs_log_force: error 5 returned.
[12258.307704] XFS (dm-5): xfs_log_force: error 5 returned.
[12258.307711] XFS (dm-11): xfs_log_force: error 5 returned.
[12259.712633] XFS (dm-8): xfs_log_force: error 5 returned.
[12288.194440] XFS (dm-4): xfs_log_force: error 5 returned.
[12288.194442] XFS (dm-6): xfs_log_force: error 5 returned.
[12288.194444] XFS (dm-10): xfs_log_force: error 5 returned.
[12288.322159] XFS (dm-5): xfs_log_force: error 5 returned.
[12288.322171] XFS (dm-12): xfs_log_force: error 5 returned.
[12288.322241] XFS (dm-11): xfs_log_force: error 5 returned.
[12289.727099] XFS (dm-8): xfs_log_force: error 5 returned.
[12318.208903] XFS (dm-4): xfs_log_force: error 5 returned.
[12318.208906] XFS (dm-6): xfs_log_force: error 5 returned.
[12318.208908] XFS (dm-10): xfs_log_force: error 5 returned.
[12318.336620] XFS (dm-5): xfs_log_force: error 5 returned.
[12318.336622] XFS (dm-12): xfs_log_force: error 5 returned.
[12318.336658] XFS (dm-11): xfs_log_force: error 5 returned.
[12319.741560] XFS (dm-8): xfs_log_force: error 5 returned.
[12348.223359] XFS (dm-4): xfs_log_force: error 5 returned.
[12348.223361] XFS (dm-6): xfs_log_force: error 5 returned.
[12348.223363] XFS (dm-10): xfs_log_force: error 5 returned.
[12348.351083] XFS (dm-12): xfs_log_force: error 5 returned.
[12348.351086] XFS (dm-5): xfs_log_force: error 5 returned.
[12348.351088] XFS (dm-11): xfs_log_force: error 5 returned.
[12349.756016] XFS (dm-8): xfs_log_force: error 5 returned.
[12378.237820] XFS (dm-4): xfs_log_force: error 5 returned.
[12378.237823] XFS (dm-6): xfs_log_force: error 5 returned.
[12378.237824] XFS (dm-10): xfs_log_force: error 5 returned.
[12378.365539] XFS (dm-12): xfs_log_force: error 5 returned.
[12378.365541] XFS (dm-5): xfs_log_force: error 5 returned.
[12378.365612] XFS (dm-11): xfs_log_force: error 5 returned.
[12379.770473] XFS (dm-8): xfs_log_force: error 5 returned.
[12408.252275] XFS (dm-4): xfs_log_force: error 5 returned.
[12408.252278] XFS (dm-6): xfs_log_force: error 5 returned.
[12408.252280] XFS (dm-10): xfs_log_force: error 5 returned.
[12408.380000] XFS (dm-12): xfs_log_force: error 5 returned.
[12408.380003] XFS (dm-5): xfs_log_force: error 5 returned.
[12408.380131] XFS (dm-11): xfs_log_force: error 5 returned.
[12409.784934] XFS (dm-8): xfs_log_force: error 5 returned.
[12438.266739] XFS (dm-4): xfs_log_force: error 5 returned.
[12438.266741] XFS (dm-6): xfs_log_force: error 5 returned.
[12438.266743] XFS (dm-10): xfs_log_force: error 5 returned.
[12438.394459] XFS (dm-5): xfs_log_force: error 5 returned.
[12438.394464] XFS (dm-12): xfs_log_force: error 5 returned.
[12438.394480] XFS (dm-11): xfs_log_force: error 5 returned.
[12439.799392] XFS (dm-8): xfs_log_force: error 5 returned.
[12468.281202] XFS (dm-4): xfs_log_force: error 5 returned.
[12468.281204] XFS (dm-6): xfs_log_force: error 5 returned.
[12468.281206] XFS (dm-10): xfs_log_force: error 5 returned.
[12468.408918] XFS (dm-5): xfs_log_force: error 5 returned.
[12468.408920] XFS (dm-11): xfs_log_force: error 5 returned.
[12468.408925] XFS (dm-12): xfs_log_force: error 5 returned.
[12469.813853] XFS (dm-8): xfs_log_force: error 5 returned.
[12498.295656] XFS (dm-4): xfs_log_force: error 5 returned.
[12498.295659] XFS (dm-6): xfs_log_force: error 5 returned.
[12498.295661] XFS (dm-10): xfs_log_force: error 5 returned.
[12498.423378] XFS (dm-12): xfs_log_force: error 5 returned.
[12498.423380] XFS (dm-5): xfs_log_force: error 5 returned.
[12498.423383] XFS (dm-11): xfs_log_force: error 5 returned.
[12499.828314] XFS (dm-8): xfs_log_force: error 5 returned.
[12528.310117] XFS (dm-4): xfs_log_force: error 5 returned.
[12528.310121] XFS (dm-6): xfs_log_force: error 5 returned.
[12528.310124] XFS (dm-10): xfs_log_force: error 5 returned.
[12528.437834] XFS (dm-12): xfs_log_force: error 5 returned.
[12528.437837] XFS (dm-11): xfs_log_force: error 5 returned.
[12528.437840] XFS (dm-5): xfs_log_force: error 5 returned.
[12529.842765] XFS (dm-8): xfs_log_force: error 5 returned.
[12558.324568] XFS (dm-4): xfs_log_force: error 5 returned.
[12558.324571] XFS (dm-6): xfs_log_force: error 5 returned.
[12558.324573] XFS (dm-10): xfs_log_force: error 5 returned.
[12558.452293] XFS (dm-12): xfs_log_force: error 5 returned.
[12558.452296] XFS (dm-11): xfs_log_force: error 5 returned.
[12558.452298] XFS (dm-5): xfs_log_force: error 5 returned.
[12559.857225] XFS (dm-8): xfs_log_force: error 5 returned.
[12588.339030] XFS (dm-4): xfs_log_force: error 5 returned.
[12588.339034] XFS (dm-6): xfs_log_force: error 5 returned.
[12588.339037] XFS (dm-10): xfs_log_force: error 5 returned.
[12588.466750] XFS (dm-12): xfs_log_force: error 5 returned.
[12588.466753] XFS (dm-5): xfs_log_force: error 5 returned.
[12588.466755] XFS (dm-11): xfs_log_force: error 5 returned.
[12589.871694] XFS (dm-8): xfs_log_force: error 5 returned.
[12618.353493] XFS (dm-4): xfs_log_force: error 5 returned.
[12618.353497] XFS (dm-6): xfs_log_force: error 5 returned.
[12618.353499] XFS (dm-10): xfs_log_force: error 5 returned.
[12618.481208] XFS (dm-5): xfs_log_force: error 5 returned.
[12618.481211] XFS (dm-12): xfs_log_force: error 5 returned.
[12618.481221] XFS (dm-11): xfs_log_force: error 5 returned.
[12619.886143] XFS (dm-8): xfs_log_force: error 5 returned.
[12648.367945] XFS (dm-4): xfs_log_force: error 5 returned.
[12648.367949] XFS (dm-6): xfs_log_force: error 5 returned.
[12648.367951] XFS (dm-10): xfs_log_force: error 5 returned.
[12648.495670] XFS (dm-12): xfs_log_force: error 5 returned.
[12648.495672] XFS (dm-5): xfs_log_force: error 5 returned.
[12648.495700] XFS (dm-11): xfs_log_force: error 5 returned.
[12649.900597] XFS (dm-8): xfs_log_force: error 5 returned.
[12678.382406] XFS (dm-4): xfs_log_force: error 5 returned.
[12678.382409] XFS (dm-6): xfs_log_force: error 5 returned.
[12678.382412] XFS (dm-10): xfs_log_force: error 5 returned.
[12678.510124] XFS (dm-5): xfs_log_force: error 5 returned.
[12678.510127] XFS (dm-12): xfs_log_force: error 5 returned.
[12678.510167] XFS (dm-11): xfs_log_force: error 5 returned.
[12679.915058] XFS (dm-8): xfs_log_force: error 5 returned.
[12708.396859] XFS (dm-4): xfs_log_force: error 5 returned.
[12708.396862] XFS (dm-6): xfs_log_force: error 5 returned.
[12708.396864] XFS (dm-10): xfs_log_force: error 5 returned.
[12708.524583] XFS (dm-12): xfs_log_force: error 5 returned.
[12708.524585] XFS (dm-5): xfs_log_force: error 5 returned.
[12708.524589] XFS (dm-11): xfs_log_force: error 5 returned.
[12709.929514] XFS (dm-8): xfs_log_force: error 5 returned.
[12738.411316] XFS (dm-4): xfs_log_force: error 5 returned.
[12738.411319] XFS (dm-6): xfs_log_force: error 5 returned.
[12738.411321] XFS (dm-10): xfs_log_force: error 5 returned.
[12738.539039] XFS (dm-5): xfs_log_force: error 5 returned.
[12738.539043] XFS (dm-12): xfs_log_force: error 5 returned.
[12738.539101] XFS (dm-11): xfs_log_force: error 5 returned.
[12739.943976] XFS (dm-8): xfs_log_force: error 5 returned.
[12768.425781] XFS (dm-4): xfs_log_force: error 5 returned.
[12768.425784] XFS (dm-6): xfs_log_force: error 5 returned.
[12768.425786] XFS (dm-10): xfs_log_force: error 5 returned.
[12768.553499] XFS (dm-5): xfs_log_force: error 5 returned.
[12768.553501] XFS (dm-12): xfs_log_force: error 5 returned.
[12768.553504] XFS (dm-11): xfs_log_force: error 5 returned.
[12769.958432] XFS (dm-8): xfs_log_force: error 5 returned.
[12798.440235] XFS (dm-4): xfs_log_force: error 5 returned.
[12798.440238] XFS (dm-6): xfs_log_force: error 5 returned.
[12798.440240] XFS (dm-10): xfs_log_force: error 5 returned.
[12798.567960] XFS (dm-12): xfs_log_force: error 5 returned.
[12798.567962] XFS (dm-5): xfs_log_force: error 5 returned.
[12798.568044] XFS (dm-11): xfs_log_force: error 5 returned.
[12799.972886] XFS (dm-8): xfs_log_force: error 5 returned.
[12828.454698] XFS (dm-4): xfs_log_force: error 5 returned.
[12828.454701] XFS (dm-6): xfs_log_force: error 5 returned.
[12828.454703] XFS (dm-10): xfs_log_force: error 5 returned.
[12828.582415] XFS (dm-12): xfs_log_force: error 5 returned.
[12828.582418] XFS (dm-5): xfs_log_force: error 5 returned.
[12828.582437] XFS (dm-11): xfs_log_force: error 5 returned.
[12829.987349] XFS (dm-8): xfs_log_force: error 5 returned.
[12858.469149] XFS (dm-4): xfs_log_force: error 5 returned.
[12858.469152] XFS (dm-6): xfs_log_force: error 5 returned.
[12858.469154] XFS (dm-10): xfs_log_force: error 5 returned.
[12858.596872] XFS (dm-12): xfs_log_force: error 5 returned.
[12858.596877] XFS (dm-5): xfs_log_force: error 5 returned.
[12858.596973] XFS (dm-11): xfs_log_force: error 5 returned.
[12860.001803] XFS (dm-8): xfs_log_force: error 5 returned.
[12888.483609] XFS (dm-4): xfs_log_force: error 5 returned.
[12888.483611] XFS (dm-6): xfs_log_force: error 5 returned.
[12888.483613] XFS (dm-10): xfs_log_force: error 5 returned.
[12888.611333] XFS (dm-5): xfs_log_force: error 5 returned.
[12888.611336] XFS (dm-12): xfs_log_force: error 5 returned.
[12888.611344] XFS (dm-11): xfs_log_force: error 5 returned.
[12890.016269] XFS (dm-8): xfs_log_force: error 5 returned.
[12918.498074] XFS (dm-4): xfs_log_force: error 5 returned.
[12918.498077] XFS (dm-6): xfs_log_force: error 5 returned.
[12918.498079] XFS (dm-10): xfs_log_force: error 5 returned.
[12918.625789] XFS (dm-5): xfs_log_force: error 5 returned.
[12918.625792] XFS (dm-12): xfs_log_force: error 5 returned.
[12918.625825] XFS (dm-11): xfs_log_force: error 5 returned.
[12920.030724] XFS (dm-8): xfs_log_force: error 5 returned.
[12948.512529] XFS (dm-4): xfs_log_force: error 5 returned.
[12948.512532] XFS (dm-6): xfs_log_force: error 5 returned.
[12948.512534] XFS (dm-10): xfs_log_force: error 5 returned.
[12948.640255] XFS (dm-12): xfs_log_force: error 5 returned.
[12948.640257] XFS (dm-5): xfs_log_force: error 5 returned.
[12948.640290] XFS (dm-11): xfs_log_force: error 5 returned.
[12950.045188] XFS (dm-8): xfs_log_force: error 5 returned.
[12978.526990] XFS (dm-4): xfs_log_force: error 5 returned.
[12978.526992] XFS (dm-6): xfs_log_force: error 5 returned.
[12978.526994] XFS (dm-10): xfs_log_force: error 5 returned.
[12978.654709] XFS (dm-12): xfs_log_force: error 5 returned.
[12978.654711] XFS (dm-5): xfs_log_force: error 5 returned.
[12978.654768] XFS (dm-11): xfs_log_force: error 5 returned.
[12980.059641] XFS (dm-8): xfs_log_force: error 5 returned.
[13008.541442] XFS (dm-4): xfs_log_force: error 5 returned.
[13008.541445] XFS (dm-6): xfs_log_force: error 5 returned.
[13008.541447] XFS (dm-10): xfs_log_force: error 5 returned.
[13008.669167] XFS (dm-12): xfs_log_force: error 5 returned.
[13008.669169] XFS (dm-5): xfs_log_force: error 5 returned.
[13008.669178] XFS (dm-11): xfs_log_force: error 5 returned.
[13010.074097] XFS (dm-8): xfs_log_force: error 5 returned.
[13038.555900] XFS (dm-4): xfs_log_force: error 5 returned.
[13038.555902] XFS (dm-6): xfs_log_force: error 5 returned.
[13038.555904] XFS (dm-10): xfs_log_force: error 5 returned.
[13038.683624] XFS (dm-5): xfs_log_force: error 5 returned.
[13038.683627] XFS (dm-12): xfs_log_force: error 5 returned.
[13038.683707] XFS (dm-11): xfs_log_force: error 5 returned.
[13040.088560] XFS (dm-8): xfs_log_force: error 5 returned.
[13068.570364] XFS (dm-4): xfs_log_force: error 5 returned.
[13068.570366] XFS (dm-6): xfs_log_force: error 5 returned.
[13068.570368] XFS (dm-10): xfs_log_force: error 5 returned.
[13068.698079] XFS (dm-5): xfs_log_force: error 5 returned.
[13068.698082] XFS (dm-12): xfs_log_force: error 5 returned.
[13068.698118] XFS (dm-11): xfs_log_force: error 5 returned.
[13070.103014] XFS (dm-8): xfs_log_force: error 5 returned.
[13098.584818] XFS (dm-4): xfs_log_force: error 5 returned.
[13098.584820] XFS (dm-6): xfs_log_force: error 5 returned.
[13098.584822] XFS (dm-10): xfs_log_force: error 5 returned.
[13098.712543] XFS (dm-12): xfs_log_force: error 5 returned.
[13098.712545] XFS (dm-5): xfs_log_force: error 5 returned.
[13098.712547] XFS (dm-11): xfs_log_force: error 5 returned.
[13100.117474] XFS (dm-8): xfs_log_force: error 5 returned.
[13128.599279] XFS (dm-4): xfs_log_force: error 5 returned.
[13128.599281] XFS (dm-6): xfs_log_force: error 5 returned.
[13128.599283] XFS (dm-10): xfs_log_force: error 5 returned.
[13128.726996] XFS (dm-12): xfs_log_force: error 5 returned.
[13128.726999] XFS (dm-5): xfs_log_force: error 5 returned.
[13128.727015] XFS (dm-11): xfs_log_force: error 5 returned.
[13130.131930] XFS (dm-8): xfs_log_force: error 5 returned.
[13158.613730] XFS (dm-4): xfs_log_force: error 5 returned.
[13158.613733] XFS (dm-6): xfs_log_force: error 5 returned.
[13158.613734] XFS (dm-10): xfs_log_force: error 5 returned.
[13158.741456] XFS (dm-12): xfs_log_force: error 5 returned.
[13158.741458] XFS (dm-5): xfs_log_force: error 5 returned.
[13158.741483] XFS (dm-11): xfs_log_force: error 5 returned.
[13160.146390] XFS (dm-8): xfs_log_force: error 5 returned.
[13188.628191] XFS (dm-4): xfs_log_force: error 5 returned.
[13188.628195] XFS (dm-6): xfs_log_force: error 5 returned.
[13188.628197] XFS (dm-10): xfs_log_force: error 5 returned.
[13188.755912] XFS (dm-5): xfs_log_force: error 5 returned.
[13188.755919] XFS (dm-12): xfs_log_force: error 5 returned.
[13188.755935] XFS (dm-11): xfs_log_force: error 5 returned.
[13190.160846] XFS (dm-8): xfs_log_force: error 5 returned.
[13218.642653] XFS (dm-4): xfs_log_force: error 5 returned.
[13218.642656] XFS (dm-6): xfs_log_force: error 5 returned.
[13218.642658] XFS (dm-10): xfs_log_force: error 5 returned.
[13218.770368] XFS (dm-5): xfs_log_force: error 5 returned.
[13218.770378] XFS (dm-12): xfs_log_force: error 5 returned.
[13218.770413] XFS (dm-11): xfs_log_force: error 5 returned.
[13220.175303] XFS (dm-8): xfs_log_force: error 5 returned.
[13248.657114] XFS (dm-4): xfs_log_force: error 5 returned.
[13248.657116] XFS (dm-6): xfs_log_force: error 5 returned.
[13248.657118] XFS (dm-10): xfs_log_force: error 5 returned.
[13248.784830] XFS (dm-12): xfs_log_force: error 5 returned.
[13248.784838] XFS (dm-5): xfs_log_force: error 5 returned.
[13248.784872] XFS (dm-11): xfs_log_force: error 5 returned.
[13250.189765] XFS (dm-8): xfs_log_force: error 5 returned.
[13278.671569] XFS (dm-4): xfs_log_force: error 5 returned.
[13278.671572] XFS (dm-6): xfs_log_force: error 5 returned.
[13278.671574] XFS (dm-10): xfs_log_force: error 5 returned.
[13278.799286] XFS (dm-12): xfs_log_force: error 5 returned.
[13278.799288] XFS (dm-5): xfs_log_force: error 5 returned.
[13278.799305] XFS (dm-11): xfs_log_force: error 5 returned.
[13280.204216] XFS (dm-8): xfs_log_force: error 5 returned.
[13308.686020] XFS (dm-4): xfs_log_force: error 5 returned.
[13308.686022] XFS (dm-6): xfs_log_force: error 5 returned.
[13308.686024] XFS (dm-10): xfs_log_force: error 5 returned.
[13308.813745] XFS (dm-12): xfs_log_force: error 5 returned.
[13308.813747] XFS (dm-5): xfs_log_force: error 5 returned.
[13308.813810] XFS (dm-11): xfs_log_force: error 5 returned.
[13310.218673] XFS (dm-8): xfs_log_force: error 5 returned.
[13338.700477] XFS (dm-4): xfs_log_force: error 5 returned.
[13338.700480] XFS (dm-6): xfs_log_force: error 5 returned.
[13338.700482] XFS (dm-10): xfs_log_force: error 5 returned.
[13338.828201] XFS (dm-5): xfs_log_force: error 5 returned.
[13338.828203] XFS (dm-12): xfs_log_force: error 5 returned.
[13338.828247] XFS (dm-11): xfs_log_force: error 5 returned.
[13340.233139] XFS (dm-8): xfs_log_force: error 5 returned.
[13368.714947] XFS (dm-4): xfs_log_force: error 5 returned.
[13368.714950] XFS (dm-6): xfs_log_force: error 5 returned.
[13368.714952] XFS (dm-10): xfs_log_force: error 5 returned.
[13368.842657] XFS (dm-12): xfs_log_force: error 5 returned.
[13368.842659] XFS (dm-5): xfs_log_force: error 5 returned.
[13368.842761] XFS (dm-11): xfs_log_force: error 5 returned.
[13370.247592] XFS (dm-8): xfs_log_force: error 5 returned.
[13398.729394] XFS (dm-4): xfs_log_force: error 5 returned.
[13398.729397] XFS (dm-6): xfs_log_force: error 5 returned.
[13398.729399] XFS (dm-10): xfs_log_force: error 5 returned.
[13398.857119] XFS (dm-12): xfs_log_force: error 5 returned.
[13398.857121] XFS (dm-5): xfs_log_force: error 5 returned.
[13398.857189] XFS (dm-11): xfs_log_force: error 5 returned.
[13400.262047] XFS (dm-8): xfs_log_force: error 5 returned.
[13428.743854] XFS (dm-4): xfs_log_force: error 5 returned.
[13428.743857] XFS (dm-6): xfs_log_force: error 5 returned.
[13428.743858] XFS (dm-10): xfs_log_force: error 5 returned.
[13428.871572] XFS (dm-5): xfs_log_force: error 5 returned.
[13428.871575] XFS (dm-12): xfs_log_force: error 5 returned.
[13428.871685] XFS (dm-11): xfs_log_force: error 5 returned.
[13430.276507] XFS (dm-8): xfs_log_force: error 5 returned.
[13458.758306] XFS (dm-4): xfs_log_force: error 5 returned.
[13458.758309] XFS (dm-6): xfs_log_force: error 5 returned.
[13458.758311] XFS (dm-10): xfs_log_force: error 5 returned.
[13458.886030] XFS (dm-12): xfs_log_force: error 5 returned.
[13458.886033] XFS (dm-5): xfs_log_force: error 5 returned.
[13458.886132] XFS (dm-11): xfs_log_force: error 5 returned.
[13460.290960] XFS (dm-8): xfs_log_force: error 5 returned.
[13488.772763] XFS (dm-4): xfs_log_force: error 5 returned.
[13488.772766] XFS (dm-6): xfs_log_force: error 5 returned.
[13488.772768] XFS (dm-10): xfs_log_force: error 5 returned.
[13488.900488] XFS (dm-12): xfs_log_force: error 5 returned.
[13488.900490] XFS (dm-5): xfs_log_force: error 5 returned.
[13488.900619] XFS (dm-11): xfs_log_force: error 5 returned.
[13490.305424] XFS (dm-8): xfs_log_force: error 5 returned.
[13518.787228] XFS (dm-4): xfs_log_force: error 5 returned.
[13518.787231] XFS (dm-6): xfs_log_force: error 5 returned.
[13518.787232] XFS (dm-10): xfs_log_force: error 5 returned.
[13518.914943] XFS (dm-5): xfs_log_force: error 5 returned.
[13518.914946] XFS (dm-12): xfs_log_force: error 5 returned.
[13518.915075] XFS (dm-11): xfs_log_force: error 5 returned.
[13520.319879] XFS (dm-8): xfs_log_force: error 5 returned.
[13548.801682] XFS (dm-4): xfs_log_force: error 5 returned.
[13548.801685] XFS (dm-6): xfs_log_force: error 5 returned.
[13548.801687] XFS (dm-10): xfs_log_force: error 5 returned.
[13548.929407] XFS (dm-12): xfs_log_force: error 5 returned.
[13548.929410] XFS (dm-5): xfs_log_force: error 5 returned.
[13548.929559] XFS (dm-11): xfs_log_force: error 5 returned.
[13550.334333] XFS (dm-8): xfs_log_force: error 5 returned.
[13578.816143] XFS (dm-4): xfs_log_force: error 5 returned.
[13578.816146] XFS (dm-6): xfs_log_force: error 5 returned.
[13578.816147] XFS (dm-10): xfs_log_force: error 5 returned.
[13578.943861] XFS (dm-12): xfs_log_force: error 5 returned.
[13578.943863] XFS (dm-5): xfs_log_force: error 5 returned.
[13578.943929] XFS (dm-11): xfs_log_force: error 5 returned.
[13580.348796] XFS (dm-8): xfs_log_force: error 5 returned.
[13608.830595] XFS (dm-4): xfs_log_force: error 5 returned.
[13608.830598] XFS (dm-6): xfs_log_force: error 5 returned.
[13608.830599] XFS (dm-10): xfs_log_force: error 5 returned.
[13608.958318] XFS (dm-12): xfs_log_force: error 5 returned.
[13608.958322] XFS (dm-5): xfs_log_force: error 5 returned.
[13608.958495] XFS (dm-11): xfs_log_force: error 5 returned.
[13610.363254] XFS (dm-8): xfs_log_force: error 5 returned.
[13638.845053] XFS (dm-4): xfs_log_force: error 5 returned.
[13638.845056] XFS (dm-6): xfs_log_force: error 5 returned.
[13638.845058] XFS (dm-10): xfs_log_force: error 5 returned.
[13638.972778] XFS (dm-5): xfs_log_force: error 5 returned.
[13638.972780] XFS (dm-12): xfs_log_force: error 5 returned.
[13638.972784] XFS (dm-11): xfs_log_force: error 5 returned.
[13640.377715] XFS (dm-8): xfs_log_force: error 5 returned.
[13668.859524] XFS (dm-4): xfs_log_force: error 5 returned.
[13668.859526] XFS (dm-6): xfs_log_force: error 5 returned.
[13668.859528] XFS (dm-10): xfs_log_force: error 5 returned.
[13668.987234] XFS (dm-12): xfs_log_force: error 5 returned.
[13668.987236] XFS (dm-5): xfs_log_force: error 5 returned.
[13668.987257] XFS (dm-11): xfs_log_force: error 5 returned.
[13670.392168] XFS (dm-8): xfs_log_force: error 5 returned.
[13698.873972] XFS (dm-4): xfs_log_force: error 5 returned.
[13698.873975] XFS (dm-6): xfs_log_force: error 5 returned.
[13698.873977] XFS (dm-10): xfs_log_force: error 5 returned.
[13699.001697] XFS (dm-12): xfs_log_force: error 5 returned.
[13699.001699] XFS (dm-5): xfs_log_force: error 5 returned.
[13699.001721] XFS (dm-11): xfs_log_force: error 5 returned.
[13700.406631] XFS (dm-8): xfs_log_force: error 5 returned.
[13728.888433] XFS (dm-4): xfs_log_force: error 5 returned.
[13728.888436] XFS (dm-6): xfs_log_force: error 5 returned.
[13728.888437] XFS (dm-10): xfs_log_force: error 5 returned.
[13729.016152] XFS (dm-12): xfs_log_force: error 5 returned.
[13729.016154] XFS (dm-5): xfs_log_force: error 5 returned.
[13729.016189] XFS (dm-11): xfs_log_force: error 5 returned.
[13730.421083] XFS (dm-8): xfs_log_force: error 5 returned.
[13758.902885] XFS (dm-4): xfs_log_force: error 5 returned.
[13758.902888] XFS (dm-6): xfs_log_force: error 5 returned.
[13758.902890] XFS (dm-10): xfs_log_force: error 5 returned.
[13759.030611] XFS (dm-12): xfs_log_force: error 5 returned.
[13759.030613] XFS (dm-5): xfs_log_force: error 5 returned.
[13759.030618] XFS (dm-11): xfs_log_force: error 5 returned.
[13760.435541] XFS (dm-8): xfs_log_force: error 5 returned.
[13788.917344] XFS (dm-4): xfs_log_force: error 5 returned.
[13788.917346] XFS (dm-6): xfs_log_force: error 5 returned.
[13788.917348] XFS (dm-10): xfs_log_force: error 5 returned.
[13789.045068] XFS (dm-5): xfs_log_force: error 5 returned.
[13789.045071] XFS (dm-12): xfs_log_force: error 5 returned.
[13789.045126] XFS (dm-11): xfs_log_force: error 5 returned.
[13790.450005] XFS (dm-8): xfs_log_force: error 5 returned.
[13818.931815] XFS (dm-4): xfs_log_force: error 5 returned.
[13818.931818] XFS (dm-6): xfs_log_force: error 5 returned.
[13818.931820] XFS (dm-10): xfs_log_force: error 5 returned.
[13819.059526] XFS (dm-12): xfs_log_force: error 5 returned.
[13819.059528] XFS (dm-5): xfs_log_force: error 5 returned.
[13819.059564] XFS (dm-11): xfs_log_force: error 5 returned.
[13820.464460] XFS (dm-8): xfs_log_force: error 5 returned.
[13848.946267] XFS (dm-4): xfs_log_force: error 5 returned.
[13848.946269] XFS (dm-6): xfs_log_force: error 5 returned.
[13848.946271] XFS (dm-10): xfs_log_force: error 5 returned.
[13849.073991] XFS (dm-12): xfs_log_force: error 5 returned.
[13849.073994] XFS (dm-5): xfs_log_force: error 5 returned.
[13849.074066] XFS (dm-11): xfs_log_force: error 5 returned.
[13850.478923] XFS (dm-8): xfs_log_force: error 5 returned.
[13878.960730] XFS (dm-4): xfs_log_force: error 5 returned.
[13878.960733] XFS (dm-6): xfs_log_force: error 5 returned.
[13878.960735] XFS (dm-10): xfs_log_force: error 5 returned.
[13879.088448] XFS (dm-12): xfs_log_force: error 5 returned.
[13879.088450] XFS (dm-5): xfs_log_force: error 5 returned.
[13879.088515] XFS (dm-11): xfs_log_force: error 5 returned.
[13880.493382] XFS (dm-8): xfs_log_force: error 5 returned.
[13908.975183] XFS (dm-4): xfs_log_force: error 5 returned.
[13908.975185] XFS (dm-6): xfs_log_force: error 5 returned.
[13908.975187] XFS (dm-10): xfs_log_force: error 5 returned.
[13909.102909] XFS (dm-12): xfs_log_force: error 5 returned.
[13909.102911] XFS (dm-5): xfs_log_force: error 5 returned.
[13909.102923] XFS (dm-11): xfs_log_force: error 5 returned.
[13910.507838] XFS (dm-8): xfs_log_force: error 5 returned.
[13938.989642] XFS (dm-4): xfs_log_force: error 5 returned.
[13938.989645] XFS (dm-6): xfs_log_force: error 5 returned.
[13938.989647] XFS (dm-10): xfs_log_force: error 5 returned.
[13939.117366] XFS (dm-5): xfs_log_force: error 5 returned.
[13939.117369] XFS (dm-12): xfs_log_force: error 5 returned.
[13939.117372] XFS (dm-11): xfs_log_force: error 5 returned.
[13940.522300] XFS (dm-8): xfs_log_force: error 5 returned.
[13969.004108] XFS (dm-4): xfs_log_force: error 5 returned.
[13969.004110] XFS (dm-6): xfs_log_force: error 5 returned.
[13969.004112] XFS (dm-10): xfs_log_force: error 5 returned.
[13969.131823] XFS (dm-5): xfs_log_force: error 5 returned.
[13969.131826] XFS (dm-12): xfs_log_force: error 5 returned.
[13969.131847] XFS (dm-11): xfs_log_force: error 5 returned.
[13970.536758] XFS (dm-8): xfs_log_force: error 5 returned.
[13999.018563] XFS (dm-4): xfs_log_force: error 5 returned.
[13999.018565] XFS (dm-6): xfs_log_force: error 5 returned.
[13999.018567] XFS (dm-10): xfs_log_force: error 5 returned.
[13999.146285] XFS (dm-12): xfs_log_force: error 5 returned.
[13999.146288] XFS (dm-5): xfs_log_force: error 5 returned.
[13999.146323] XFS (dm-11): xfs_log_force: error 5 returned.
[14000.551221] XFS (dm-8): xfs_log_force: error 5 returned.
[14029.033024] XFS (dm-4): xfs_log_force: error 5 returned.
[14029.033026] XFS (dm-6): xfs_log_force: error 5 returned.
[14029.033028] XFS (dm-10): xfs_log_force: error 5 returned.
[14029.160742] XFS (dm-12): xfs_log_force: error 5 returned.
[14029.160745] XFS (dm-5): xfs_log_force: error 5 returned.
[14029.160761] XFS (dm-11): xfs_log_force: error 5 returned.
[14030.565673] XFS (dm-8): xfs_log_force: error 5 returned.
[14059.047480] XFS (dm-4): xfs_log_force: error 5 returned.
[14059.047483] XFS (dm-6): xfs_log_force: error 5 returned.
[14059.047486] XFS (dm-10): xfs_log_force: error 5 returned.
[14059.175203] XFS (dm-12): xfs_log_force: error 5 returned.
[14059.175206] XFS (dm-5): xfs_log_force: error 5 returned.
[14059.175273] XFS (dm-11): xfs_log_force: error 5 returned.
[14060.580132] XFS (dm-8): xfs_log_force: error 5 returned.
[14089.061936] XFS (dm-4): xfs_log_force: error 5 returned.
[14089.061939] XFS (dm-6): xfs_log_force: error 5 returned.
[14089.061941] XFS (dm-10): xfs_log_force: error 5 returned.
[14089.189660] XFS (dm-12): xfs_log_force: error 5 returned.
[14089.189662] XFS (dm-5): xfs_log_force: error 5 returned.
[14089.189716] XFS (dm-11): xfs_log_force: error 5 returned.
[14090.594597] XFS (dm-8): xfs_log_force: error 5 returned.
[14119.076400] XFS (dm-4): xfs_log_force: error 5 returned.
[14119.076402] XFS (dm-6): xfs_log_force: error 5 returned.
[14119.076404] XFS (dm-10): xfs_log_force: error 5 returned.
[14119.204115] XFS (dm-5): xfs_log_force: error 5 returned.
[14119.204118] XFS (dm-12): xfs_log_force: error 5 returned.
[14119.204212] XFS (dm-11): xfs_log_force: error 5 returned.
[14120.609051] XFS (dm-8): xfs_log_force: error 5 returned.
[14149.090854] XFS (dm-4): xfs_log_force: error 5 returned.
[14149.090857] XFS (dm-6): xfs_log_force: error 5 returned.
[14149.090859] XFS (dm-10): xfs_log_force: error 5 returned.
[14149.218579] XFS (dm-12): xfs_log_force: error 5 returned.
[14149.218582] XFS (dm-11): xfs_log_force: error 5 returned.
[14149.218584] XFS (dm-5): xfs_log_force: error 5 returned.
[14150.623507] XFS (dm-8): xfs_log_force: error 5 returned.
[14179.105317] XFS (dm-4): xfs_log_force: error 5 returned.
[14179.105320] XFS (dm-6): xfs_log_force: error 5 returned.
[14179.105322] XFS (dm-10): xfs_log_force: error 5 returned.
[14179.233034] XFS (dm-12): xfs_log_force: error 5 returned.
[14179.233036] XFS (dm-5): xfs_log_force: error 5 returned.
[14179.233044] XFS (dm-11): xfs_log_force: error 5 returned.
[14180.637968] XFS (dm-8): xfs_log_force: error 5 returned.
[14209.119768] XFS (dm-4): xfs_log_force: error 5 returned.
[14209.119770] XFS (dm-6): xfs_log_force: error 5 returned.
[14209.119772] XFS (dm-10): xfs_log_force: error 5 returned.
[14209.247492] XFS (dm-12): xfs_log_force: error 5 returned.
[14209.247495] XFS (dm-5): xfs_log_force: error 5 returned.
[14209.247520] XFS (dm-11): xfs_log_force: error 5 returned.
[14210.652422] XFS (dm-8): xfs_log_force: error 5 returned.
[14239.134226] XFS (dm-4): xfs_log_force: error 5 returned.
[14239.134229] XFS (dm-6): xfs_log_force: error 5 returned.
[14239.134231] XFS (dm-10): xfs_log_force: error 5 returned.
[14239.261950] XFS (dm-5): xfs_log_force: error 5 returned.
[14239.261953] XFS (dm-12): xfs_log_force: error 5 returned.
[14239.261984] XFS (dm-11): xfs_log_force: error 5 returned.
[14240.666886] XFS (dm-8): xfs_log_force: error 5 returned.
[14269.148698] XFS (dm-4): xfs_log_force: error 5 returned.
[14269.148702] XFS (dm-6): xfs_log_force: error 5 returned.
[14269.148704] XFS (dm-10): xfs_log_force: error 5 returned.
[14269.276407] XFS (dm-12): xfs_log_force: error 5 returned.
[14269.276410] XFS (dm-5): xfs_log_force: error 5 returned.
[14269.276453] XFS (dm-11): xfs_log_force: error 5 returned.
[14270.681342] XFS (dm-8): xfs_log_force: error 5 returned.
[14299.163152] XFS (dm-4): xfs_log_force: error 5 returned.
[14299.163155] XFS (dm-6): xfs_log_force: error 5 returned.
[14299.163157] XFS (dm-10): xfs_log_force: error 5 returned.
[14299.290871] XFS (dm-12): xfs_log_force: error 5 returned.
[14299.290878] XFS (dm-5): xfs_log_force: error 5 returned.
[14299.290890] XFS (dm-11): xfs_log_force: error 5 returned.
[14300.695797] XFS (dm-8): xfs_log_force: error 5 returned.
[14329.177608] XFS (dm-4): xfs_log_force: error 5 returned.
[14329.177610] XFS (dm-6): xfs_log_force: error 5 returned.
[14329.177612] XFS (dm-10): xfs_log_force: error 5 returned.
[14329.305326] XFS (dm-12): xfs_log_force: error 5 returned.
[14329.305328] XFS (dm-5): xfs_log_force: error 5 returned.
[14329.305398] XFS (dm-11): xfs_log_force: error 5 returned.
[14330.710260] XFS (dm-8): xfs_log_force: error 5 returned.
[14359.192061] XFS (dm-4): xfs_log_force: error 5 returned.
[14359.192064] XFS (dm-6): xfs_log_force: error 5 returned.
[14359.192066] XFS (dm-10): xfs_log_force: error 5 returned.
[14359.319783] XFS (dm-12): xfs_log_force: error 5 returned.
[14359.319787] XFS (dm-5): xfs_log_force: error 5 returned.
[14359.319790] XFS (dm-11): xfs_log_force: error 5 returned.
[14360.724718] XFS (dm-8): xfs_log_force: error 5 returned.
[14389.206519] XFS (dm-4): xfs_log_force: error 5 returned.
[14389.206522] XFS (dm-6): xfs_log_force: error 5 returned.
[14389.206524] XFS (dm-10): xfs_log_force: error 5 returned.
[14389.334242] XFS (dm-5): xfs_log_force: error 5 returned.
[14389.334245] XFS (dm-12): xfs_log_force: error 5 returned.
[14389.334254] XFS (dm-11): xfs_log_force: error 5 returned.
[14390.739179] XFS (dm-8): xfs_log_force: error 5 returned.
[14419.220982] XFS (dm-4): xfs_log_force: error 5 returned.
[14419.220984] XFS (dm-6): xfs_log_force: error 5 returned.
[14419.220986] XFS (dm-10): xfs_log_force: error 5 returned.
[14419.348697] XFS (dm-5): xfs_log_force: error 5 returned.
[14419.348699] XFS (dm-12): xfs_log_force: error 5 returned.
[14419.348732] XFS (dm-11): xfs_log_force: error 5 returned.
[14420.753632] XFS (dm-8): xfs_log_force: error 5 returned.
[14449.235440] XFS (dm-4): xfs_log_force: error 5 returned.
[14449.235442] XFS (dm-6): xfs_log_force: error 5 returned.
[14449.235445] XFS (dm-10): xfs_log_force: error 5 returned.
[14449.363159] XFS (dm-12): xfs_log_force: error 5 returned.
[14449.363165] XFS (dm-5): xfs_log_force: error 5 returned.
[14449.363205] XFS (dm-11): xfs_log_force: error 5 returned.
[14450.768094] XFS (dm-8): xfs_log_force: error 5 returned.
[14479.249893] XFS (dm-4): xfs_log_force: error 5 returned.
[14479.249896] XFS (dm-6): xfs_log_force: error 5 returned.
[14479.249898] XFS (dm-10): xfs_log_force: error 5 returned.
[14479.377612] XFS (dm-12): xfs_log_force: error 5 returned.
[14479.377614] XFS (dm-5): xfs_log_force: error 5 returned.
[14479.377665] XFS (dm-11): xfs_log_force: error 5 returned.
[14480.782551] XFS (dm-8): xfs_log_force: error 5 returned.
[14509.264344] XFS (dm-4): xfs_log_force: error 5 returned.
[14509.264347] XFS (dm-6): xfs_log_force: error 5 returned.
[14509.264349] XFS (dm-10): xfs_log_force: error 5 returned.
[14509.392069] XFS (dm-12): xfs_log_force: error 5 returned.
[14509.392071] XFS (dm-5): xfs_log_force: error 5 returned.
[14509.392079] XFS (dm-11): xfs_log_force: error 5 returned.
[14510.796998] XFS (dm-8): xfs_log_force: error 5 returned.
[14539.278801] XFS (dm-4): xfs_log_force: error 5 returned.
[14539.278804] XFS (dm-6): xfs_log_force: error 5 returned.
[14539.278806] XFS (dm-10): xfs_log_force: error 5 returned.
[14539.406524] XFS (dm-5): xfs_log_force: error 5 returned.
[14539.406528] XFS (dm-12): xfs_log_force: error 5 returned.
[14539.406606] XFS (dm-11): xfs_log_force: error 5 returned.
[14540.811461] XFS (dm-8): xfs_log_force: error 5 returned.
[14569.293264] XFS (dm-4): xfs_log_force: error 5 returned.
[14569.293267] XFS (dm-6): xfs_log_force: error 5 returned.
[14569.293269] XFS (dm-10): xfs_log_force: error 5 returned.
[14569.420980] XFS (dm-5): xfs_log_force: error 5 returned.
[14569.420982] XFS (dm-12): xfs_log_force: error 5 returned.
[14569.421020] XFS (dm-11): xfs_log_force: error 5 returned.
[14570.825915] XFS (dm-8): xfs_log_force: error 5 returned.
[14599.307725] XFS (dm-4): xfs_log_force: error 5 returned.
[14599.307728] XFS (dm-6): xfs_log_force: error 5 returned.
[14599.307729] XFS (dm-10): xfs_log_force: error 5 returned.
[14599.435443] XFS (dm-12): xfs_log_force: error 5 returned.
[14599.435450] XFS (dm-5): xfs_log_force: error 5 returned.
[14599.435548] XFS (dm-11): xfs_log_force: error 5 returned.
[14600.840375] XFS (dm-8): xfs_log_force: error 5 returned.
[14629.322179] XFS (dm-4): xfs_log_force: error 5 returned.
[14629.322182] XFS (dm-6): xfs_log_force: error 5 returned.
[14629.322184] XFS (dm-10): xfs_log_force: error 5 returned.
[14629.449897] XFS (dm-12): xfs_log_force: error 5 returned.
[14629.449900] XFS (dm-5): xfs_log_force: error 5 returned.
[14629.449966] XFS (dm-11): xfs_log_force: error 5 returned.
[14630.854831] XFS (dm-8): xfs_log_force: error 5 returned.
[14659.336632] XFS (dm-4): xfs_log_force: error 5 returned.
[14659.336634] XFS (dm-6): xfs_log_force: error 5 returned.
[14659.336636] XFS (dm-10): xfs_log_force: error 5 returned.
[14659.464357] XFS (dm-12): xfs_log_force: error 5 returned.
[14659.464359] XFS (dm-5): xfs_log_force: error 5 returned.
[14659.464527] XFS (dm-11): xfs_log_force: error 5 returned.
[14660.869290] XFS (dm-8): xfs_log_force: error 5 returned.
[14689.351091] XFS (dm-4): xfs_log_force: error 5 returned.
[14689.351093] XFS (dm-6): xfs_log_force: error 5 returned.
[14689.351095] XFS (dm-10): xfs_log_force: error 5 returned.
[14689.478815] XFS (dm-5): xfs_log_force: error 5 returned.
[14689.478818] XFS (dm-12): xfs_log_force: error 5 returned.
[14689.478823] XFS (dm-11): xfs_log_force: error 5 returned.
[14690.883748] XFS (dm-8): xfs_log_force: error 5 returned.
[14719.365563] XFS (dm-4): xfs_log_force: error 5 returned.
[14719.365566] XFS (dm-6): xfs_log_force: error 5 returned.
[14719.365568] XFS (dm-10): xfs_log_force: error 5 returned.
[14719.493273] XFS (dm-12): xfs_log_force: error 5 returned.
[14719.493275] XFS (dm-5): xfs_log_force: error 5 returned.
[14719.493285] XFS (dm-11): xfs_log_force: error 5 returned.
[14720.898207] XFS (dm-8): xfs_log_force: error 5 returned.
[14749.380019] XFS (dm-4): xfs_log_force: error 5 returned.
[14749.380021] XFS (dm-6): xfs_log_force: error 5 returned.
[14749.380023] XFS (dm-10): xfs_log_force: error 5 returned.
[14749.507735] XFS (dm-12): xfs_log_force: error 5 returned.
[14749.507743] XFS (dm-5): xfs_log_force: error 5 returned.
[14749.507755] XFS (dm-11): xfs_log_force: error 5 returned.
[14750.912671] XFS (dm-8): xfs_log_force: error 5 returned.
[14779.394473] XFS (dm-4): xfs_log_force: error 5 returned.
[14779.394476] XFS (dm-6): xfs_log_force: error 5 returned.
[14779.394478] XFS (dm-10): xfs_log_force: error 5 returned.
[14779.522193] XFS (dm-12): xfs_log_force: error 5 returned.
[14779.522199] XFS (dm-5): xfs_log_force: error 5 returned.
[14779.522210] XFS (dm-11): xfs_log_force: error 5 returned.
[14780.927123] XFS (dm-8): xfs_log_force: error 5 returned.
[14809.408927] XFS (dm-4): xfs_log_force: error 5 returned.
[14809.408930] XFS (dm-6): xfs_log_force: error 5 returned.
[14809.408932] XFS (dm-10): xfs_log_force: error 5 returned.
[14809.536653] XFS (dm-12): xfs_log_force: error 5 returned.
[14809.536655] XFS (dm-5): xfs_log_force: error 5 returned.
[14809.536703] XFS (dm-11): xfs_log_force: error 5 returned.
[14810.941582] XFS (dm-8): xfs_log_force: error 5 returned.
[14839.423386] XFS (dm-4): xfs_log_force: error 5 returned.
[14839.423389] XFS (dm-6): xfs_log_force: error 5 returned.
[14839.423390] XFS (dm-10): xfs_log_force: error 5 returned.
[14839.551110] XFS (dm-12): xfs_log_force: error 5 returned.
[14839.551112] XFS (dm-5): xfs_log_force: error 5 returned.
[14839.551156] XFS (dm-11): xfs_log_force: error 5 returned.
[14840.956047] XFS (dm-8): xfs_log_force: error 5 returned.
[14869.437858] XFS (dm-4): xfs_log_force: error 5 returned.
[14869.437860] XFS (dm-6): xfs_log_force: error 5 returned.
[14869.437862] XFS (dm-10): xfs_log_force: error 5 returned.
[14869.565568] XFS (dm-12): xfs_log_force: error 5 returned.
[14869.565570] XFS (dm-5): xfs_log_force: error 5 returned.
[14869.565631] XFS (dm-11): xfs_log_force: error 5 returned.
[14870.970502] XFS (dm-8): xfs_log_force: error 5 returned.
[14899.452314] XFS (dm-4): xfs_log_force: error 5 returned.
[14899.452317] XFS (dm-6): xfs_log_force: error 5 returned.
[14899.452319] XFS (dm-10): xfs_log_force: error 5 returned.
[14899.580032] XFS (dm-12): xfs_log_force: error 5 returned.
[14899.580039] XFS (dm-5): xfs_log_force: error 5 returned.
[14899.580099] XFS (dm-11): xfs_log_force: error 5 returned.
[14900.984960] XFS (dm-8): xfs_log_force: error 5 returned.
[14929.466771] XFS (dm-4): xfs_log_force: error 5 returned.
[14929.466773] XFS (dm-6): xfs_log_force: error 5 returned.
[14929.466775] XFS (dm-10): xfs_log_force: error 5 returned.
[14929.594489] XFS (dm-12): xfs_log_force: error 5 returned.
[14929.594496] XFS (dm-5): xfs_log_force: error 5 returned.
[14929.594580] XFS (dm-11): xfs_log_force: error 5 returned.
[14930.999423] XFS (dm-8): xfs_log_force: error 5 returned.
[14959.481225] XFS (dm-4): xfs_log_force: error 5 returned.
[14959.481228] XFS (dm-6): xfs_log_force: error 5 returned.
[14959.481230] XFS (dm-10): xfs_log_force: error 5 returned.
[14959.608949] XFS (dm-12): xfs_log_force: error 5 returned.
[14959.608952] XFS (dm-5): xfs_log_force: error 5 returned.
[14959.609040] XFS (dm-11): xfs_log_force: error 5 returned.
[14961.013880] XFS (dm-8): xfs_log_force: error 5 returned.
[14989.495685] XFS (dm-4): xfs_log_force: error 5 returned.
[14989.495688] XFS (dm-6): xfs_log_force: error 5 returned.
[14989.495690] XFS (dm-10): xfs_log_force: error 5 returned.
[14989.623409] XFS (dm-5): xfs_log_force: error 5 returned.
[14989.623412] XFS (dm-12): xfs_log_force: error 5 returned.
[14989.623511] XFS (dm-11): xfs_log_force: error 5 returned.
[14991.028345] XFS (dm-8): xfs_log_force: error 5 returned.
[15019.510157] XFS (dm-4): xfs_log_force: error 5 returned.
[15019.510159] XFS (dm-6): xfs_log_force: error 5 returned.
[15019.510161] XFS (dm-10): xfs_log_force: error 5 returned.
[15019.637868] XFS (dm-12): xfs_log_force: error 5 returned.
[15019.637870] XFS (dm-5): xfs_log_force: error 5 returned.
[15019.637978] XFS (dm-11): xfs_log_force: error 5 returned.
[15021.042801] XFS (dm-8): xfs_log_force: error 5 returned.
[15049.524612] XFS (dm-4): xfs_log_force: error 5 returned.
[15049.524615] XFS (dm-6): xfs_log_force: error 5 returned.
[15049.524617] XFS (dm-10): xfs_log_force: error 5 returned.
[15049.652331] XFS (dm-12): xfs_log_force: error 5 returned.
[15049.652337] XFS (dm-5): xfs_log_force: error 5 returned.
[15049.652349] XFS (dm-11): xfs_log_force: error 5 returned.
[15051.057257] XFS (dm-8): xfs_log_force: error 5 returned.
[15079.539067] XFS (dm-4): xfs_log_force: error 5 returned.
[15079.539070] XFS (dm-6): xfs_log_force: error 5 returned.
[15079.539072] XFS (dm-10): xfs_log_force: error 5 returned.
[15079.666786] XFS (dm-12): xfs_log_force: error 5 returned.
[15079.666792] XFS (dm-5): xfs_log_force: error 5 returned.
[15079.666843] XFS (dm-11): xfs_log_force: error 5 returned.
[15081.071720] XFS (dm-8): xfs_log_force: error 5 returned.
[15109.553520] XFS (dm-4): xfs_log_force: error 5 returned.
[15109.553523] XFS (dm-6): xfs_log_force: error 5 returned.
[15109.553525] XFS (dm-10): xfs_log_force: error 5 returned.
[15109.681243] XFS (dm-12): xfs_log_force: error 5 returned.
[15109.681247] XFS (dm-5): xfs_log_force: error 5 returned.
[15109.681294] XFS (dm-11): xfs_log_force: error 5 returned.
[15111.086179] XFS (dm-8): xfs_log_force: error 5 returned.
[15139.567978] XFS (dm-4): xfs_log_force: error 5 returned.
[15139.567981] XFS (dm-6): xfs_log_force: error 5 returned.
[15139.567983] XFS (dm-10): xfs_log_force: error 5 returned.
[15139.695703] XFS (dm-5): xfs_log_force: error 5 returned.
[15139.695705] XFS (dm-12): xfs_log_force: error 5 returned.
[15139.695716] XFS (dm-11): xfs_log_force: error 5 returned.
[15141.100646] XFS (dm-8): xfs_log_force: error 5 returned.
[15169.582442] XFS (dm-4): xfs_log_force: error 5 returned.
[15169.582445] XFS (dm-6): xfs_log_force: error 5 returned.
[15169.582447] XFS (dm-10): xfs_log_force: error 5 returned.
[15169.710158] XFS (dm-5): xfs_log_force: error 5 returned.
[15169.710169] XFS (dm-12): xfs_log_force: error 5 returned.
[15169.710185] XFS (dm-11): xfs_log_force: error 5 returned.
[15171.115093] XFS (dm-8): xfs_log_force: error 5 returned.
[15199.596902] XFS (dm-4): xfs_log_force: error 5 returned.
[15199.596905] XFS (dm-6): xfs_log_force: error 5 returned.
[15199.596907] XFS (dm-10): xfs_log_force: error 5 returned.
[15199.724618] XFS (dm-12): xfs_log_force: error 5 returned.
[15199.724628] XFS (dm-5): xfs_log_force: error 5 returned.
[15199.724665] XFS (dm-11): xfs_log_force: error 5 returned.
[15201.129555] XFS (dm-8): xfs_log_force: error 5 returned.
[15229.611357] XFS (dm-4): xfs_log_force: error 5 returned.
[15229.611360] XFS (dm-6): xfs_log_force: error 5 returned.
[15229.611362] XFS (dm-10): xfs_log_force: error 5 returned.
[15229.739075] XFS (dm-12): xfs_log_force: error 5 returned.
[15229.739082] XFS (dm-11): xfs_log_force: error 5 returned.
[15229.739085] XFS (dm-5): xfs_log_force: error 5 returned.
[15231.144007] XFS (dm-8): xfs_log_force: error 5 returned.
[15259.625809] XFS (dm-4): xfs_log_force: error 5 returned.
[15259.625811] XFS (dm-6): xfs_log_force: error 5 returned.
[15259.625813] XFS (dm-10): xfs_log_force: error 5 returned.
[15259.753534] XFS (dm-12): xfs_log_force: error 5 returned.
[15259.753536] XFS (dm-5): xfs_log_force: error 5 returned.
[15259.753543] XFS (dm-11): xfs_log_force: error 5 returned.
[15261.158463] XFS (dm-8): xfs_log_force: error 5 returned.
[15289.640266] XFS (dm-4): xfs_log_force: error 5 returned.
[15289.640268] XFS (dm-6): xfs_log_force: error 5 returned.
[15289.640270] XFS (dm-10): xfs_log_force: error 5 returned.
[15289.767990] XFS (dm-5): xfs_log_force: error 5 returned.
[15289.767993] XFS (dm-12): xfs_log_force: error 5 returned.
[15289.768017] XFS (dm-11): xfs_log_force: error 5 returned.
[15291.172927] XFS (dm-8): xfs_log_force: error 5 returned.
[15319.654736] XFS (dm-4): xfs_log_force: error 5 returned.
[15319.654739] XFS (dm-6): xfs_log_force: error 5 returned.
[15319.654741] XFS (dm-10): xfs_log_force: error 5 returned.
[15319.782447] XFS (dm-12): xfs_log_force: error 5 returned.
[15319.782450] XFS (dm-5): xfs_log_force: error 5 returned.
[15319.782484] XFS (dm-11): xfs_log_force: error 5 returned.
[15321.187380] XFS (dm-8): xfs_log_force: error 5 returned.
[15349.669189] XFS (dm-4): xfs_log_force: error 5 returned.
[15349.669191] XFS (dm-6): xfs_log_force: error 5 returned.
[15349.669193] XFS (dm-10): xfs_log_force: error 5 returned.
[15349.796907] XFS (dm-12): xfs_log_force: error 5 returned.
[15349.796914] XFS (dm-5): xfs_log_force: error 5 returned.
[15349.796950] XFS (dm-11): xfs_log_force: error 5 returned.
[15351.201840] XFS (dm-8): xfs_log_force: error 5 returned.
[15379.683644] XFS (dm-4): xfs_log_force: error 5 returned.
[15379.683647] XFS (dm-6): xfs_log_force: error 5 returned.
[15379.683649] XFS (dm-10): xfs_log_force: error 5 returned.
[15379.811362] XFS (dm-12): xfs_log_force: error 5 returned.
[15379.811369] XFS (dm-5): xfs_log_force: error 5 returned.
[15379.811425] XFS (dm-11): xfs_log_force: error 5 returned.
[15381.216295] XFS (dm-8): xfs_log_force: error 5 returned.
[15409.698094] XFS (dm-4): xfs_log_force: error 5 returned.
[15409.698097] XFS (dm-6): xfs_log_force: error 5 returned.
[15409.698099] XFS (dm-10): xfs_log_force: error 5 returned.
[15409.825820] XFS (dm-12): xfs_log_force: error 5 returned.
[15409.825822] XFS (dm-5): xfs_log_force: error 5 returned.
[15409.825885] XFS (dm-11): xfs_log_force: error 5 returned.
[15411.230749] XFS (dm-8): xfs_log_force: error 5 returned.
[15439.712552] XFS (dm-4): xfs_log_force: error 5 returned.
[15439.712554] XFS (dm-6): xfs_log_force: error 5 returned.
[15439.712556] XFS (dm-10): xfs_log_force: error 5 returned.
[15439.840276] XFS (dm-5): xfs_log_force: error 5 returned.
[15439.840279] XFS (dm-12): xfs_log_force: error 5 returned.
[15439.840298] XFS (dm-11): xfs_log_force: error 5 returned.
[... the identical "xfs_log_force: error 5 returned." message repeats
for dm-4, dm-5, dm-6, dm-8, dm-10, dm-11 and dm-12 in ~30 second
cycles through 16791.895824; roughly 300 lines trimmed ...]
[16798.555345] XFS (dm-4): xfs_log_force: error 5 returned.
[16798.555352] XFS (dm-4): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16798.584982] XFS (dm-4): xfs_log_force: error 5 returned.
[16798.623227] XFS (dm-4): xfs_log_force: error 5 returned.
[16806.100829] XFS (dm-6): xfs_log_force: error 5 returned.
[16806.100837] XFS (dm-6): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16806.131651] XFS (dm-6): xfs_log_force: error 5 returned.
[16806.171364] XFS (dm-6): xfs_log_force: error 5 returned.
[16812.785100] XFS (dm-8): xfs_log_force: error 5 returned.
[16812.785107] XFS (dm-8): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16812.814558] XFS (dm-8): xfs_log_force: error 5 returned.
[16812.849566] XFS (dm-8): xfs_log_force: error 5 returned.
[16816.295231] XFS (dm-10): xfs_log_force: error 5 returned.
[16816.295238] XFS (dm-10): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16816.325382] XFS (dm-10): xfs_log_force: error 5 returned.
[16816.363627] XFS (dm-10): xfs_log_force: error 5 returned.
[16820.505348] XFS (dm-5): xfs_log_force: error 5 returned.
[16820.505351] XFS (dm-11): xfs_log_force: error 5 returned.
[16820.505354] XFS (dm-12): xfs_log_force: error 5 returned.
[16822.543907] XFS (dm-11): xfs_log_force: error 5 returned.
[16822.543915] XFS (dm-11): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16822.573578] XFS (dm-11): xfs_log_force: error 5 returned.
[16822.610507] XFS (dm-11): xfs_log_force: error 5 returned.
[16829.659428] XFS (dm-5): xfs_log_force: error 5 returned.
[16829.659435] XFS (dm-5): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16829.692325] XFS (dm-5): xfs_log_force: error 5 returned.
[16829.729706] XFS (dm-5): xfs_log_force: error 5 returned.
[16833.529927] XFS (dm-12): xfs_log_force: error 5 returned.
[16833.529935] XFS (dm-12): xfs_do_force_shutdown(0x1) called from line
1031 of file fs/xfs/xfs_buf.c. Return address = 0xffffffffa025d2a9
[16833.559988] XFS (dm-12): xfs_log_force: error 5 returned.
[16833.599082] XFS (dm-12): xfs_log_force: error 5 returned.
[17369.624945] XFS (dm-4): Mounting Filesystem
[17369.632013] XFS (dm-4): Starting recovery (logdev: internal)
[17369.638476] XFS (dm-4): Ending recovery (logdev: internal)
[17651.973225] XFS (dm-6): Mounting Filesystem
[17651.979557] XFS (dm-6): Starting recovery (logdev: internal)
[17651.986804] XFS (dm-6): Ending recovery (logdev: internal)
[17714.780957] XFS (dm-8): Mounting Filesystem
[17715.040586] XFS (dm-8): Starting recovery (logdev: internal)
[17716.476591] XFS (dm-8): Ending recovery (logdev: internal)
[17786.815091] XFS (dm-10): Mounting Filesystem
[17786.984170] XFS (dm-10): Starting recovery (logdev: internal)
[17787.413091] XFS (dm-10): Ending recovery (logdev: internal)
[17901.190404] XFS (dm-5): Mounting Filesystem
[17901.370431] XFS (dm-5): Starting recovery (logdev: internal)
[17901.775778] XFS (dm-5): Ending recovery (logdev: internal)
[23766.709975] XFS (dm-11): Mounting Filesystem
[23766.716253] XFS (dm-11): Starting recovery (logdev: internal)
[23766.723042] XFS (dm-11): Ending recovery (logdev: internal)
[23826.394066] XFS (dm-12): Mounting Filesystem
[23826.400403] XFS (dm-12): Starting recovery (logdev: internal)
[23826.407246] XFS (dm-12): Ending recovery (logdev: internal)

***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l001
Phase 1 - find and verify superblock...
Phase 2 - using internal log
- zero log...
- scan filesystem freespace and inode maps...
- found root inode chunk
Phase 3 - for each AG...
- scan and clear agi unlinked lists...
- process known inodes and perform inode discovery...
- agno = 0
        [... - agno = 1 through - agno = 42 trimmed ...]
- agno = 43
- process newly discovered inodes...
Phase 4 - check for duplicate blocks...
- setting up duplicate extent list...
- check for inodes claiming duplicate blocks...
        [... - agno = 0 through - agno = 43 (completion order varies
        across repair threads); 44 lines trimmed ...]
Phase 5 - rebuild AG headers and trees...
- reset superblock...
Phase 6 - check inode connectivity...
- resetting contents of realtime bitmap and summary inodes
- traversing filesystem ...
- traversal finished ...
- moving disconnected inodes to lost+found ...
Phase 7 - verify and correct link counts...
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l011
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l021
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l031
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l003
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l013
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l023
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
***@Anguish-ssu-2:/iris/home/adrian/streamRT/scripts# xfs_repair /dev/s2d_a1l033
Phase 1 - find and verify superblock...
[... Phases 2-7 identical in form to the first run above: agnos 0-43
processed, no problems reported ...]
done
--
Stan
Dave Chinner
2014-08-29 23:55:38 UTC
Permalink
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
xfs_do_force_shutdown(0x8) called from line 3732 of file
fs/xfs/xfs_bmap.c.
Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE,
which means it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.
Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Post by Dave Chinner
Post by Stan Hoeppner
Newly made after every error of any kind, whether app, XFS shutdown, call
trace, etc. I've not attempted xfs_repair.
Please do.
Another storage crash yesterday. xfs_repair output inline below for the 7
filesystems. I'm also pasting the dmesg output. This time there is no
oops, no call traces. The filesystems came up fine after mounting,
replaying, and repairing.
Ok, what version of xfs_repair did you use?
Post by Stan Hoeppner
Post by Dave Chinner
The bug? The bleeding edge storage arrays being used had a
firmware bug in them. When the number of outstanding IOs hit the
*array controller* command tag queue depth limit (some several
thousand simultaneous IOs in flight) it would occasionally misdirect
a single write IO to the *wrong lun*. i.e. it would misdirect a
write.
It was only under *extreme* loads that this would happen, and it's
this sort of load that AIO+DIO can easily generate - you can have
several thousand IOs in flight without too much hassle, and that
will hit limits in the storage arrays that aren't often hit. Array
controller CTQ depth limits are a good example of a limit that
normal IO won't go near to stressing.
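To make that concrete, here is a minimal libaio + O_DIRECT sketch
(hypothetical path, sizes and queue depth, not the test app's actual
code) that puts a few thousand writes in flight from a single
io_submit() call; build with -laio:

#define _GNU_SOURCE           /* for O_DIRECT */
#include <libaio.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define QDEPTH 4096           /* beyond most array CTQ limits */
#define BLKSZ  (32 * 1024)    /* 32KB writes, like the low-rate streams */

int main(void)
{
    io_context_t ctx;
    struct iocb iocbs[QDEPTH], *iocbp[QDEPTH];
    struct io_event events[QDEPTH];
    void *buf;
    int fd, i;

    /* hypothetical preallocated test file on the LUN under test */
    fd = open("/mnt/test/file0", O_WRONLY | O_DIRECT);
    if (fd < 0) { perror("open"); return 1; }

    memset(&ctx, 0, sizeof(ctx));
    if (io_setup(QDEPTH, &ctx) < 0) { perror("io_setup"); return 1; }

    /* O_DIRECT requires sector-aligned memory */
    if (posix_memalign(&buf, 4096, BLKSZ)) return 1;
    memset(buf, 0xab, BLKSZ);

    /* queue QDEPTH writes before reaping a single completion; all of
       them can be outstanding at the array at the same time */
    for (i = 0; i < QDEPTH; i++) {
        io_prep_pwrite(&iocbs[i], fd, buf, BLKSZ, (long long)i * BLKSZ);
        iocbp[i] = &iocbs[i];
    }
    if (io_submit(ctx, QDEPTH, iocbp) < 0) { perror("io_submit"); return 1; }

    /* wait for everything to complete, then tear down */
    io_getevents(ctx, QDEPTH, QDEPTH, events, NULL);
    io_destroy(ctx);
    free(buf);
    return 0;
}

Nothing on the host side pushes back until fs.aio-max-nr or the
filesystem does, so an array's CTQ limit can be reached long before
the application feels any backpressure.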
I hadn't considered that up to this point. That is *very* insightful, and
applicable, since we are dealing with a beta storage array and firmware.
Worth mentioning is that the storage vendor has added a custom routine
which expends Herculean effort to identify full stripes before writeback.
Hmmmm. Food for thought, especially as it is evident that the
storage array appears to be crashing completely. At this point,
I'd say the burden of finding a corruption needs to start with
proving that the array has not done something wrong. Once you
know that what is on disk is exactly what the filesystem asked to be
written, then you can start to isolate filesystem issues. But you
need the storage to be solid and trust-worthy before going looking
for filesystem problems....
Post by Stan Hoeppner
This is because some of our writes for a given low rate stream are as low as
32KB and may be 2-3 seconds apart. With a 64-128KB chunk, 768 to 1536KB
stripe width, we'd get massive RMW without this feature. Testing thus far
shows it is fairly effective, though we still get pretty serious RMW due to
the fact we're writing 350 of these small streams per array at ~72 KB/s
max, along with 2 streams at ~48 MB/s, and 50 streams at ~1.2 MB/s.
Multiply this by 7 LUNs per controller and it becomes clear we're putting a
pretty serious load on the firmware and cache.
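To put rough numbers on that, here is a toy RAID5 write-cost model for
this geometry (64KB chunk, 12 data + 1 parity; textbook read-modify-write,
ignoring the vendor's full-stripe detection and any cache coalescing):

#include <stdio.h>

#define CHUNK  (64 * 1024)
#define NDATA  12                    /* 13 drives = 12 data + 1 parity */
#define STRIPE (CHUNK * NDATA)       /* 768KB full stripe */

/* disk IOs needed to service one write of 'bytes' */
static long write_cost(long bytes)
{
    if (bytes >= STRIPE)
        return NDATA + 1;            /* write 12 data chunks + parity */
    /* partial stripe: read old data + old parity, write new data +
       new parity = 4 IOs per chunk touched */
    return 4 * ((bytes + CHUNK - 1) / CHUNK);
}

int main(void)
{
    printf("32KB stream write: %ld disk IOs\n", write_cost(32 * 1024));
    printf("768KB full stripe: %ld disk IOs\n", write_cost(STRIPE));
    return 0;
}

That prints 4 IOs for the 32KB write versus 13 for the full 768KB
stripe, i.e. roughly 7x more payload landed per disk IO whenever a
full stripe can be assembled.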
Yup, so having the array cache do the equivalent of sequential
readahead multi-stream detection for writeback would make a big
difference. But not simple to do....
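Reduced to a sketch, the idea is per-stream sequence tracking with
destage deferred until a stripe is complete (hypothetical structures,
nothing from any real firmware):

#include <stdint.h>
#include <stdbool.h>

#define STRIPE (768 * 1024)

struct stream {
    uint64_t base;   /* stripe-aligned start of the run being filled */
    uint64_t next;   /* next expected sequential offset */
};

/* returns true when the cached run covers a full stripe and can be
   destaged as one full-stripe write with no read-modify-write */
static bool stream_write(struct stream *s, uint64_t off, uint32_t len)
{
    if (off != s->next) {
        /* non-sequential write: start tracking a new run */
        s->base = off - (off % STRIPE);
        s->next = off + len;
        return false;
    }
    s->next += len;
    return s->next - s->base >= STRIPE;
}

int main(void)
{
    struct stream s = { 0, 0 };
    int n = 0;
    /* feed 32KB sequential writes; the stripe completes on the 24th */
    while (!stream_write(&s, (uint64_t)n * 32 * 1024, 32 * 1024))
        n++;
    return (n + 1 == 24) ? 0 : 1;    /* 768KB / 32KB = 24 writes */
}

Doing that for ~400 concurrent streams per LUN inside 3 GB of
controller cache, without the bookkeeping itself stalling writeback,
is where the "not simple" comes in.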

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
Stan Hoeppner
2014-08-30 02:55:53 UTC
Permalink
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
On Thu, 28 Aug 2014 10:32:27 +1000, Dave Chinner
Post by Dave Chinner
Post by Stan Hoeppner
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c.
Return address = 0xffffffffa01cc9a6
Yup, that's kinda important. That's from xfs_bmap_finish(), and
freeing an extent has failed and triggered SHUTDOWN_CORRUPT_INCORE,
which means it's found some kind of inconsistency in the free space
btrees. So, likely the same problem that caused EFI recovery to fail
on the other volume.
Are the tests being run on newly made filesystems? If not, have
these filesystems had xfs_repair run on them after a failure? If
so, what is the error that is fixed? If not, does repairing the
filesystem make the problem go away?
Newly made after every error of any kind, whether app, XFS shutdown, call
trace, etc. I've not attempted xfs_repair.
Post by Dave Chinner
Please do.
Another storage crash yesterday. xfs_repair output inline below for the 7
filesystems. I'm also pasting the dmesg output. This time there is no
oops, no call traces. The filesystems came up fine after mounting,
replaying, and repairing.
Post by Dave Chinner
Ok, what version of xfs_repair did you use?
3.1.4, which is a little long in the tooth. I believe they built the OS
image from Squeeze 6.0. I was originally told it was Wheezy 7.0, but that
turns out to have been false.
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
The bug? The bleeding edge storage arrays being used had a
firmware bug in them. When the number of outstanding IOs hit the
*array controller* command tag queue depth limit (some several
thousand simultaneous IOs in flight) it would occasionally misdirect
a single write IO to the *wrong lun*. i.e. it would misdirect a
write.
It was only under *extreme* loads that this would happen, and it's
this sort of load that AIO+DIO can easily generate - you can have
several thousand IOs in flight without too much hassle, and that
will hit limits in the storage arrays that aren't often hit. Array
controller CTQ depth limits are a good example of a limit that
normal IO won't go near to stressing.
I hadn't considered that up to this point. That is *very* insightful, and
applicable, since we are dealing with a beta storage array and firmware.
Worth mentioning is that the storage vendor has added a custom routine
which expends Herculean effort to identify full stripes before writeback.
Hmmmm. Food for thought, especially as it is evident that the
storage array appears to be crashing completely. At this point,
I'd say the burden of finding a corruption needs to start with
proving that the array has not done something wrong. Once you
know that what is on disk is exactly what the filesystem asked to be
written, then you can start to isolate filesystem issues. But you
need the storage to be solid and trust-worthy before going looking
for filesystem problems....
Agreed. Which is why I put storage first in the subject, AIO second, and
XFS third. My initial instinct was a problem with libaio, as the crashes
only surfaced writing with AIO. I'm now seeing problems with storage on
both systems when not using AIO. We're supposed to receive a new firmware
upload next week, so hopefully that will fix some of these issues.
Post by Dave Chinner
Post by Stan Hoeppner
This is because some of our writes for a given low rate stream are as low as
32KB and may be 2-3 seconds apart. With a 64-128KB chunk, 768 to 1536KB
stripe width, we'd get massive RMW without this feature. Testing thus far
shows it is fairly effective, though we still get pretty serious RMW due to
the fact we're writing 350 of these small streams per array at ~72 KB/s
max, along with 2 streams at ~48 MB/s, and 50 streams at ~1.2 MB/s.
Multiply this by 7 LUNs per controller and it becomes clear we're putting a
pretty serious load on the firmware and cache.
Yup, so having the array cache do the equivalent of sequential
readahead multi-stream detection for writeback would make a big
difference. But not simple to do....
Not at all, especially with only 3 GB of RAM to work with, as I'm told.
Seems low for a high end controller with 4x 12G SAS ports. We're only able
to achieve ~250 MB/s per array at the application due to the access pattern
being essentially random, and still with a serious quantity of RMWs. Which
is why we're going to test with an even smaller chunk of 32KB. I believe
that's the lower bound on these controllers. For this workload 16KB or
maybe even 8KB would likely be more optimal. We're also going to test with
bcache and a 400 GB Intel DC S3700 (datacenter grade) SSD backing two LUNs.
But with bcache, chunk size should be far less relevant. I'm anxious to
kick those tires, but it'll be a couple of weeks.

Have you played with bcache yet?
--
Stan
Dave Chinner
2014-08-31 23:57:49 UTC
Permalink
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Another storage crash yesterday. xfs_repair output inline below for the 7
filesystems. I'm also pasting the dmesg output. This time there is no
oops, no call traces. The filesystems came up fine after mounting,
replaying, and repairing.
Ok, what version of xfs_repair did you use?
3.1.4, which is a little long in the tooth.
And so not useful for the purposes of finding free space tree
corruptions. Old xfs_repair versions only rebuild the freespace
trees - they don't check them first. IOWs, silence from an old
xfs_repair does not mean the filesystem was free of errors.
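With a newer xfs_repair, a no-modify pass will actually flag freespace
tree damage rather than silently rebuilding it, e.g.:

# filesystem must be unmounted; -n reports what would be repaired,
# changes nothing, and exits non-zero if corruption is found
xfs_repair -n /dev/s2d_a1l001

That is the difference between knowing the trees were inconsistent
and having them rebuilt anyway.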
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
This is because some of our writes for a given low rate stream are as low as
32KB and may be 2-3 seconds apart. With a 64-128KB chunk, 768 to 1536KB
stripe width, we'd get massive RMW without this feature. Testing thus far
shows it is fairly effective, though we still get pretty serious RMW due to
the fact we're writing 350 of these small streams per array at ~72 KB/s
max, along with 2 streams at ~48 MB/s, and 50 streams at ~1.2 MB/s.
Multiply this by 7 LUNs per controller and it becomes clear we're putting a
pretty serious load on the firmware and cache.
Yup, so having the array cache do the equivalent of sequential
readahead multi-stream detection for writeback would make a big
difference. But not simple to do....
Not at all, especially with only 3 GB of RAM to work with, as I'm told.
Seems low for a high end controller with 4x 12G SAS ports. We're only able
to achieve ~250 MB/s per array at the application due to the access pattern
being essentially random, and still with a serious quantity of RMWs. Which
is why we're going to test with an even smaller chunk of 32KB. I believe
that's the lower bound on these controllers. For this workload 16KB or
maybe even 8KB would likely be more optimal. We're also going to test with
bcache and a 400 GB Intel DC S3700 (datacenter grade) SSD backing two LUNs.
But with bcache, chunk size should be far less relevant. I'm anxious to
kick those tires, but it'll be a couple of weeks.
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....

Cheers,

Dave.

PS: can you wrap your text at 68 or 72 columns so quoted text
doesn't overflow 80 columns and get randomly wrapped and messed up?
--
Dave Chinner
***@fromorbit.com
stan hoeppner
2014-09-01 03:36:25 UTC
Permalink
Post by Dave Chinner
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Another storage crash yesterday. xfs_repair output inline below for the 7
filesystems. I'm also pasting the dmesg output. This time there is no
oops, no call traces. The filesystems came up fine after mounting,
replaying, and repairing.
Ok, what version of xfs_repair did you use?
3.1.4, which is a little long in the tooth.
And so not useful for the purposes of finding free space tree
corruptions. Old xfs_repair versions only rebuild the freespace
trees - they don't check them first. IOWs, silence from an old
xfs_repair does not mean the filesystem was free of errors.
Post by Stan Hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
This is because some of our writes for a given low rate stream are as low as
32KB and may be 2-3 seconds apart. With a 64-128KB chunk, 768 to 1536KB
stripe width, we'd get massive RMW without this feature. Testing thus far
shows it is fairly effective, though we still get pretty serious RMW due to
the fact we're writing 350 of these small streams per array at ~72 KB/s
max, along with 2 streams at ~48 MB/s, and 50 streams at ~1.2 MB/s.
Multiply this by 7 LUNs per controller and it becomes clear we're putting a
pretty serious load on the firmware and cache.
Yup, so having the array cache do the equivalent of sequential
readahead multi-stream detection for writeback would make a big
difference. But not simple to do....
Not at all, especially with only 3 GB of RAM to work with, as I'm told.
Seems low for a high end controller with 4x 12G SAS ports. We're only able
to achieve ~250 MB/s per array at the application due to the access pattern
being essentially random, and still with a serious quantity of RMWs. Which
is why we're going to test with an even smaller chunk of 32KB. I believe
that's the lower bound on these controllers. For this workload 16KB or
maybe even 8KB would likely be more optimal. We're also going to test with
bcache and a 400 GB Intel DC S3700 (datacenter grade) SSD backing two LUNs.
But with bcache, chunk size should be far less relevant. I'm anxious to
kick those tires, but it'll be a couple of weeks.
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....
Yikes. I hadn't yet heard such opinions expressed. By go wrong I
assume you mean the btrees or cached sector data getting broken, corrupted?
Post by Dave Chinner
Cheers,
Dave.
PS: can you wrap your text at 68 or 72 columns so quoted text
doesn't overflow 80 columns and get randomly wrapped and messed up?
This email should be. Lemme see what I can do with the others. The
lovely Cisco VPN client I must use kills routing to my local subnet, so
Icedove can't connect to my IMAP server when the VPN is active. The
test harness app requires a shell, unfortunately, so I have to keep the
tunnel open all the time, as the test runs are 40+ hours each. My last
test just crashed a bit ago so I can use Icedove for this reply.

I've been using Roundcube, an older version, which doesn't let me set
the line wrap, at least not in the web GUI; it might be in the config. I
normally only use it when I'm remote, which is rare, so I've not kept it
updated.

Lemme see if I can open the firewall and get IMAP working through their
VPN link so I can use Icedove. Sorry for the inconvenience. Believe
me, it negatively affects me more than you. :(

Stan
Dave Chinner
2014-09-01 23:45:29 UTC
Permalink
Post by stan hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....
Yikes. I hadn't yet heard such opinions expressed. By go wrong I
assume you mean the btrees or cached sector data getting broken, corrupted?
bcache is a complex filesystem hidden inside a block device. If
bcache goes AWOL, so does all the data on your block device.
Need I say more?
Post by stan hoeppner
Post by Dave Chinner
PS: can you wrap your text at 68 or 72 columns so quoted text
doesn't overflow 80 columns and get randomly wrapped and messed up?
This email should be. Lemme see what I can do with the others. The
lovely Cisco VPN client I must use kills routing to my local subnet, so
Icedove can't connect to my IMAP server when the VPN is active. The
test harness app requires a shell, unfortunately, so I have to keep the
tunnel open all the time, as the test runs are 40+ hours each. My last
test just crashed a bit ago so I can use Icedove for this reply.
screen is your friend when it comes to keeping remote shells
active as the network comes and goes. VPN drops out, just bring it
back up when you need it and reconnect to the remote screen instance
and it's like you never left....
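The whole workflow is only a few commands (session name arbitrary):

screen -S testrun     # start a named session, launch the test in it
                      # Ctrl-a d detaches, leaving the run going
screen -ls            # list sessions once the VPN is back up
screen -r testrun     # reattach to the running session
screen -dRR testrun   # or: detach any stale attachment, then reattach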

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
stan hoeppner
2014-09-02 17:15:05 UTC
Permalink
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....
Yikes. I hadn't yet heard such opinions expressed. By go wrong I
assume you mean the btrees or cached sector data getting broken, corrupted?
bcache is a complex filesystem hidden inside a block device. If
bcache goes AWOL, so does all the data on your block device.
Need I say more?
So it's no different in that regard than the black box implementations
such as LSI's CacheCade and various SAN vendor SSD caching
implementations. Or are you saying the bcache code complexity is so
much greater that failure is more likely than with the vendor implementations?
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
PS: can you wrap your text at 68 or 72 columns so quoted text
doesn't overflow 80 columns and get randomly wrapped and messed up?
This email should be. Lemme see what I can do with the others. The
lovely Cisco VPN client I must use kills routing to my local subnet, so
Icedove can't connect to my IMAP server when the VPN is active. The
test harness app requires a shell, unfortunately, so I have to keep the
tunnel open all the time, as the test runs are 40+ hours each. My last
test just crashed a bit ago so I can use Icedove for this reply.
screen is your friend when it comes to keeping remote shells
active as the network comes and goes. VPN drops out, just bring it
back up when you need it and reconnect to the remote screen instance
and it's like you never left....
Thanks for this tip. I'd heard of screen before but never used it. I
will say the man page is a bit intimidating for such an apparently
simple tool...

Stan
Dave Chinner
2014-09-02 22:19:15 UTC
Permalink
Post by stan hoeppner
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....
Yikes. I hadn't yet heard such opinions expressed. By go wrong I
assume you mean the btrees or cached sector data getting broken, corrupted?
bcache is a complex filesystem hidden inside a block device. If
bcache goes AWOL, so does all the data on your block device.
Need I say more?
So it's no different in that regard than the black box implementations
such as LSI's CacheCade and various SAN vendor SSD caching
implementations. Or are you saying the bcache code complexity is so
much greater that failure is more likely than with the vendor implementations?
No, not the code complexity in particular. It's more that compared
to vendor SSD caching implementations there's an awful lot less
testing and validation, and people tend to use random, unreliable
hardware for cache devices. It's great when it works, but the
configuration and validation of correct behaviour in error
conditions falls to the user...
Post by stan hoeppner
Post by Dave Chinner
screen is your friend when it comes to keeping remote shells
active as the network comes and goes. VPN drops out, just bring it
back up when you need it and reconnect to the remote screen instance
and it's like you never left....
Thanks for this tip. I'd heard of screen before but never used it. I
will say the man page is a bit intimidating for such an apparently
simple tool...
Yeah, I use about 0.0001% of what screen can do. It could lose most
of its functionality and I wouldn't notice or care. tmux is another
option for this functionality, but I've never used it because I
found out about screen first...

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
stan hoeppner
2014-09-07 05:23:03 UTC
Permalink
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
Post by Stan Hoeppner
Have you played with bcache yet?
Enough to scare me. So many ways for things to go wrong, no easy way
to recover when things go wrong. And that's before I even get to
performance warts, like having systems stall completely because
there's tens or hundreds of GB of 4k random writes that have to be
flushed to slow SATA RAID6 in the cache....
Yikes. I hadn't yet heard such opinions expressed. By go wrong I
assume you mean the btrees or cached sector data getting broken, corrupted?
bcache is a complex filesystem hidden inside a block device. If
bcache goes AWOL, so does all the data on your block device.
Need I say more?
So it's no different in that regard than the black box implementations
such as LSI's CacheCade and various SAN vendor SSD caching
implementations. Or are you saying the bcache code complexity is so
much greater that failure is more likely than with the vendor implementations?
No, not the code complexity in particular. It's more that compared
to vendor SSD caching implementations there's an awful lot less
testing and validation, and people tend to use random, unreliable
hardware for cache devices. It's great when it works, but the
configuration and validation of correct behaviour in error
conditions falls to the user...
Understood. I'm seeing the potential need for a future contract with
Kent if we decide to go forward with bcache. He could advise on a
testing and validation regimen, optimize for the workload, and
provide code fixes or features to overcome problems. Attempting to
use something so new as bcache in a 24x7 commercial workload likely
needs author support.
Post by Dave Chinner
Post by stan hoeppner
Post by Dave Chinner
screen is your friend when it comes to keeping remote shells
active as the network comes and goes. VPN drops out, just bring it
back up when you need it and reconnect to the remote screen instance
and it's like you never left....
Thanks for this tip. I'd heard of screen before but never used it. I
will say the man page is a bit intimidating for such an apparently
simple tool...
Yeah, I use about 0.0001% of what screen can do. It could lose most
of its functionality and I wouldn't notice or care. tmux is another
option for this functionality, but I've never used it because I
found out about screen first...
I'd guess there are many utils out there used in the same way.


I have some more information regarding the AIO issue. I fired up the
test harness and it ran for 30 hours at 706 MB/s avg write rate, 303
MB/s per LUN, nearly flawlessly, less than 0.01% buffer loss, and avg IO
times were less than 0.5 seconds. Then the app crashed and I found the
following in dmesg. I had to "hard reset" the box due to the shrapnel.
There are no IO errors of any kind leading up to the forced shutdown.
I assume the inode update and streamRT-sa hung task traces are a result
of the forced shutdown, not a cause of it. In lieu of an xfs_repair
with a version newer than I'm able to install, any ideas what caused the
forced shutdown after 30 hours, given there are no errors preceding it?
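For context, the harness's submission path boils down to the sketch
below. This is minimal and illustrative only, not our actual code:
the file path, buffer size, and queue depth are made up, and the
4k alignment is just the usual O_DIRECT requirement. Build with
"gcc -o dio-sketch dio-sketch.c -laio".

#define _GNU_SOURCE             /* for O_DIRECT */
#include <libaio.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define BUF_SIZE (768 * 1024)   /* illustrative; one stripe width here */

int main(void)
{
        io_context_t ctx = 0;
        struct iocb cb, *cbs[1] = { &cb };
        struct io_event ev;
        void *buf;
        int fd, rc;

        rc = io_setup(64, &ctx);        /* 64-deep queue, illustrative */
        if (rc < 0) { fprintf(stderr, "io_setup: %d\n", rc); return 1; }

        /* O_DIRECT: buffer, length and offset must be sector aligned */
        fd = open("/mnt/VOL1/sg-04/str-0015/f-0000000005",
                  O_WRONLY | O_DIRECT);
        if (fd < 0) { perror("open"); return 1; }

        if (posix_memalign(&buf, 4096, BUF_SIZE)) return 1;
        memset(buf, 0xab, BUF_SIZE);

        io_prep_pwrite(&cb, fd, buf, BUF_SIZE, 0);

        /* io_submit() is where the hung tasks below are parked,
         * waiting on the inode rwsem taken in xfs_rw_ilock() */
        rc = io_submit(ctx, 1, cbs);
        if (rc != 1) { fprintf(stderr, "io_submit: %d\n", rc); return 1; }

        io_getevents(ctx, 1, 1, &ev, NULL);     /* reap the completion */

        io_destroy(ctx);
        close(fd);
        free(buf);
        return 0;
}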


Sep 6 06:33:33 Anguish-ssu-1 kernel: [288087.334863] XFS (dm-5): xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa02009a6
Sep 6 06:33:42 Anguish-ssu-1 kernel: [288096.220920] XFS (dm-5): failed to update timestamps for inode 0x2ffc9caae
Sep 6 06:33:48 Anguish-ssu-1 kernel: [288102.492641] XFS (dm-5): failed to update timestamps for inode 0x97b7566dd
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599412] INFO: task streamRT-sa:14706 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599414] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599416] streamRT-sa D ffff883f3c018408 0 14706 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599420] ffff883e6fc09b28 0000000000000086 0000000000000000 ffff8840666f5180
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599425] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599428] ffff883e6fc09fd8 ffff883e6fc08000 00000000000122c0 ffff883e6fc08000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599432] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599441] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599443] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599446] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599451] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599454] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599466] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599472] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599476] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599481] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599487] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599493] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599499] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599503] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599505] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599508] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599510] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599513] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599516] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599519] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599521] INFO: task streamRT-sa:14713 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599523] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599524] streamRT-sa D ffff883b4f52ea48 0 14713 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599527] ffff883e74af9b28 0000000000000086 0000000000000000 ffff884066622140
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599530] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599534] ffff883e74af9fd8 ffff883e74af8000 00000000000122c0 ffff883e74af8000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599537] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599540] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599542] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599544] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599547] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599549] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599555] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599561] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599563] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599569] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599575] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599580] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599586] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599589] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599591] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599593] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599596] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599598] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599601] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599603] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599605] INFO: task streamRT-sa:14723 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599607] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599608] streamRT-sa D ffff883e754b2b88 0 14723 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599610] ffff883e6fca3b28 0000000000000086 0000000000000000 ffff8840662521c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599614] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599617] ffff883e6fca3fd8 ffff883e6fca2000 00000000000122c0 ffff883e6fca2000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599620] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599623] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599625] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599628] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599630] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599632] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599638] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599644] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599646] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599652] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599657] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599663] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599669] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599671] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599674] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599676] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599678] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599681] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599684] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599686] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599688] INFO: task streamRT-sa:14730 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599689] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599691] streamRT-sa D ffff883dc2360388 0 14730 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599693] ffff883e6fde1b28 0000000000000086 0000000000000000 ffff884066043080
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599696] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599700] ffff883e6fde1fd8 ffff883e6fde0000 00000000000122c0 ffff883e6fde0000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599703] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599705] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599708] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599710] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599712] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599715] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599720] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599726] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599728] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599734] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599740] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599745] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599751] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599754] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599756] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599758] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599761] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599763] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599766] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599768] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599770] INFO: task streamRT-sa:14733 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599771] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599773] streamRT-sa D ffff883e7555cb08 0 14733 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599775] ffff883e7389db28 0000000000000086 0000000000000000 ffff88406663a040
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599778] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599782] ffff883e7389dfd8 ffff883e7389c000 00000000000122c0 ffff883e7389c000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599785] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599787] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599790] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599792] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599794] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599797] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599802] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599808] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599811] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599816] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599822] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599827] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599833] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599836] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599838] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599840] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599843] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599845] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599848] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599850] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599852] INFO: task streamRT-sa:14736 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599853] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599855] streamRT-sa D ffff883e73915448 0 14736 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599857] ffff883e73bb5b28 0000000000000086 0000000000000000 ffff884066709080
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599860] 000000025600a331 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599864] ffff883e73bb5fd8 ffff883e73bb4000 00000000000122c0 ffff883e73bb4000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599867] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599870] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599872] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599874] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599877] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599879] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599885] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599890] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599892] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599898] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599904] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599909] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599915] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599918] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599920] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599922] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599925] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599927] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599930] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599932] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599934] INFO: task streamRT-sa:14738 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599936] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599937] streamRT-sa D ffff883f7c605488 0 14738 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599939] ffff883c4cda7b28 0000000000000086 0000000000000000 ffff8840667bd1c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599943] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599946] ffff883c4cda7fd8 ffff883c4cda6000 00000000000122c0 ffff883c4cda6000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599949] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599952] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599954] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599956] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599959] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599961] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599967] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599972] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599975] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599980] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599986] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599991] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.599997] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600000] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600002] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600004] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600007] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600009] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600012] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600014] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600016] INFO: task streamRT-sa:14739 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600018] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600019] streamRT-sa D ffff883e75536a08 0 14739 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600021] ffff883b4f411b28 0000000000000086 0000000000000000 ffff884066739140
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600025] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600028] ffff883b4f411fd8 ffff883b4f410000 00000000000122c0 ffff883b4f410000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600031] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600034] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600036] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600038] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600041] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600043] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600048] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600054] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600056] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600062] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600068] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600073] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600079] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600082] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600084] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600086] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600089] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600091] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600094] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600096] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600099] INFO: task streamRT-sa:14768 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600100] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600101] streamRT-sa D ffff883b5f120308 0 14768 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600104] ffff883cca73bb28 0000000000000086 0000000000000000 ffffffff81813020
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600107] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600110] ffff883cca73bfd8 ffff883cca73a000 00000000000122c0 ffff883cca73a000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600113] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600116] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600118] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600120] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600123] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600125] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600131] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600136] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600139] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600144] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600150] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600156] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600161] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600164] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600166] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600168] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600171] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600173] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600176] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600178] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600180] INFO: task streamRT-sa:14789 blocked for more than 120 seconds.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600181] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600183] streamRT-sa D ffff883cca430b08 0 14789 14051 0x00000004
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600185] ffff883f3d9c3b28 0000000000000086 0000000000000000 ffff884066739140
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600188] 0000000000000000 0000000000000000 00000000000122c0 00000000000122c0
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600192] ffff883f3d9c3fd8 ffff883f3d9c2000 00000000000122c0 ffff883f3d9c2000
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600195] Call Trace:
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600197] [<ffffffff814f5fd7>] schedule+0x64/0x66
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600200] [<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600202] [<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600204] [<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600207] [<ffffffff814f5458>] ? down_write+0x25/0x27
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600212] [<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600218] [<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600220] [<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600226] [<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600231] [<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600237] [<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600243] [<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600245] [<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600248] [<ffffffff8113360f>] ? aio_fsync+0x29/0x29
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600250] [<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600252] [<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600255] [<ffffffff8113513d>] do_io_submit+0xfa/0x271
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600258] [<ffffffff811352c4>] sys_io_submit+0x10/0x12
Sep 6 06:35:41 Anguish-ssu-1 kernel: [288215.600260] [<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Sep 6 15:42:02 Anguish-ssu-1 kernel: [320925.045195] SysRq : Resetting


Thanks,
Stan
Dave Chinner
2014-09-07 23:39:10 UTC
Permalink
Post by stan hoeppner
I have some more information regarding the AIO issue. I fired up the
test harness and it ran for 30 hours at 706 MB/s avg write rate, 303
MB/s per LUN, nearly flawlessly, less than 0.01% buffer loss, and avg IO
times were less than 0.5 seconds. Then the app crashed and I found the
following in dmesg. I had to "hard reset" the box due to the shrapnel.
There are no IO errors of any kind leading up to the forced shutdown.
I assume the inode update and streamRT-sa hung task traces are a result
of the forced shutdown, not a cause of it. In lieu of an xfs_repair
with a version newer than I'm able to install, any ideas what caused the
forced shutdown after 30 hours, given there are no errors preceding it?
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa02009a6
Sep 6 06:33:42 Anguish-ssu-1 kernel: [288096.220920] XFS (dm-5): failed to update timestamps for inode 0x2ffc9caae
Hi Stan, can you turn off line wrapping for stuff you paste in? It's
all but unreadable when it line wraps like this.

Next, you need to turn /proc/sys/fs/xfs/error_level up to 11 so that
it dumps a stack trace on corruption events. I don't have a tree in
front of me (I can't remember what kernel version you are running) to
convert that line number to something meaningful, so it's not a
great help...

Was there anything in the logs before the shutdown? i.e. can you
paste the dmesg output from the start of the test (i.e. the mount of
the fs) to the end?
Post by stan hoeppner
[<ffffffff814f5fd7>] schedule+0x64/0x66
[<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
[<ffffffff814f5458>] ? down_write+0x25/0x27
[<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
[<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
[<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
[<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[<ffffffff8113513d>] do_io_submit+0xfa/0x271
[<ffffffff811352c4>] sys_io_submit+0x10/0x12
[<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Which implies that the shutdown didn't unlock the inode correctly.
But without knowing what the call stack at the time of the shutdown
was, I can't really tell...

Cheers,

Dave.
--
Dave Chinner
***@fromorbit.com
stan hoeppner
2014-09-08 15:13:16 UTC
Permalink
Post by Dave Chinner
Post by stan hoeppner
I have some more information regarding the AIO issue. I fired up the
test harness and it ran for 30 hours at 706 MB/s avg write rate, 303
MB/s per LUN, nearly flawlessly, less than 0.01% buffer loss, and avg IO
times were less than 0.5 seconds. Then the app crashed and I found the
following in dmesg. I had to "hard reset" the box due to the shrapnel.
There are no IO errors of any kind leading up to the forced shutdown.
I assume the inode update and streamRT-sa hung task traces are a result
of the forced shutdown, not a cause of it. In lieu of an xfs_repair
with a version newer than I'm able to install, any ideas what caused the
forced shutdown after 30 hours, given there are no errors preceding it?
xfs_do_force_shutdown(0x8) called from line 3732 of file fs/xfs/xfs_bmap.c. Return address = 0xffffffffa02009a6
Sep 6 06:33:42 Anguish-ssu-1 kernel: [288096.220920] XFS (dm-5): failed to update timestamps for inode 0x2ffc9caae
Hi Stan, can you turn off line wrapping for stuff you paste in? It's
all but unreadable when it line wraps like this.
Sorry. I switched my daily desktop from Windows/Tbird to Wheezy/Icedove
and haven't tweaked it much yet. I set hard wrap at 72 and that's the
problem. I'll set format=flowed and see if that helps.
Post by Dave Chinner
Next, you need to turn /proc/sys/fs/xfs/error_level up to 11 so that
it dumps a stack trace on corruption events. I don't have a tree in
front of me (I can't remember what kernel version you are running) to
convert that line number to something meaningful, so it's not a
great help...
error_level is now 11 on both systems and will survive reboots. It's
kernel 3.4.26.
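For the record, that's just the one knob; roughly what I did on these
boxes (the sysctl.conf line is how I made it persistent):

# echo 11 > /proc/sys/fs/xfs/error_level
# echo 'fs.xfs.error_level = 11' >> /etc/sysctl.conf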
Post by Dave Chinner
Was there anything in the logs before the shutdown? i.e. can you
paste the dmesg output from the start of the test (i.e. the mount of
the fs) to the end?
They have this setup in a quasi production/test manner, which is
frustrating. The two test rigs PXE/tftp boot and mount rootfs on NFS.
Both systems remote-log kern.log to a single file on the boot
server, so I grep for hostname. dmesg isn't logged remotely and is
lost after a reboot, and kern.log doesn't get populated with
everything that goes into dmesg, which is why I don't have the mount
entries. I'll be sure to grab all of dmesg next time before
rebooting. However, I don't recall any errors of any kind prior to the
shutdown, which in itself is strange.
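Next crash I'll dump the whole ring buffer to the NFS root before
resetting, something along these lines (destination path made up):

# dmesg > /mnt/scratch/dmesg-$(hostname)-$(date +%Y%m%d-%H%M).log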
Post by Dave Chinner
Post by stan hoeppner
[<ffffffff814f5fd7>] schedule+0x64/0x66
[<ffffffff814f66ec>] rwsem_down_failed_common+0xdb/0x10d
[<ffffffff814f6731>] rwsem_down_write_failed+0x13/0x15
[<ffffffff81261913>] call_rwsem_down_write_failed+0x13/0x20
[<ffffffff814f5458>] ? down_write+0x25/0x27
[<ffffffffa01e75e4>] xfs_ilock+0x4f/0xb4 [xfs]
[<ffffffffa01e40e5>] xfs_rw_ilock+0x2c/0x33 [xfs]
[<ffffffff814f6ac6>] ? _raw_spin_unlock_irq+0x27/0x32
[<ffffffffa01e4519>] xfs_file_aio_write_checks+0x41/0xfe [xfs]
[<ffffffffa01e46ff>] xfs_file_dio_aio_write+0x103/0x1fc [xfs]
[<ffffffffa01e4ac3>] xfs_file_aio_write+0x152/0x1b5 [xfs]
[<ffffffffa01e4971>] ? xfs_file_buffered_aio_write+0x179/0x179 [xfs]
[<ffffffff81133694>] aio_rw_vect_retry+0x85/0x18a
[<ffffffff8113360f>] ? aio_fsync+0x29/0x29
[<ffffffff81134c10>] aio_run_iocb+0x7b/0x149
[<ffffffff81134fe9>] io_submit_one+0x199/0x1f3
[<ffffffff8113513d>] do_io_submit+0xfa/0x271
[<ffffffff811352c4>] sys_io_submit+0x10/0x12
[<ffffffff814fc912>] system_call_fastpath+0x16/0x1b
Which implies that the shutdown didn't unlock the inode correctly.
But without knowing what the call stack at the time of the shutdown
was, I can't really tell...
And error_level 11 should give us the call stack, correct?

My current run without AIO should be completing in a few hours. I have
one more non-AIO run to make after that. Takes about 6 hours after all
files are populated. Then I'll start another AIO run and try to get you
the info you need. May take 30 hours again, might take less than an
hour. It's not consistent.

Thanks,
Stan
