+ /opt/ddn/mvapich/bin/mpiexec -ppn 6 -np 12 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/software/install/ior/git-ddn/bin/ior -i 3 -s 1 -t 102400 -b 23530045440 -D 120 -a MPIIO -e -g -z -k -o /esfs/jtacquaviva/ioperf/file_write -w
+ tee -a ./output/COUNT:1#NN:2#PPN:6#API:MPIIO#T:102400.txt
IOR-3.0.1: MPI Coordinated Test of Parallel I/O

ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
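The repeated failure above is arithmetic, not I/O: IOR requires the block size (-b) to be an integer multiple of the transfer size (-t), and the 23530045440 passed as -b is not divisible by the 102400 passed as -t. A minimal shell check (hypothetical, not part of the original script) reproduces the diagnosis and rounds the block size down to the nearest valid value:

```shell
# Hypothetical pre-flight check: IOR aborts unless -b is a multiple of -t.
block=23530045440    # -b from the failed run
transfer=102400      # -t from the failed run
echo "remainder: $(( block % transfer ))"                     # prints 61440, hence the error
echo "nearest valid -b: $(( block / transfer * transfer ))"   # prints 23529984000
```

Any -b that is an exact multiple of 102400 would get the run past this check; the differing errno values printed alongside the message are incidental to the size mismatch.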
[cli_7]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 7
[cli_3]: [cli_9]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 9
[cli_5]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 5
[cli_11]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 11
[cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
[cli_6]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 6
aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 3
[cli_8]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 8
[cli_2]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 2
[cli_10]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 10
[cli_4]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 4
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 31487 RUNNING AT isc17-c05
= EXIT CODE: 255
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0@isc17-c04] HYDU_sock_write (utils/sock/sock.c:286): write error (Broken pipe)
[proxy:0:0@isc17-c04] main (pm/pmiserv/pmip.c:265): unable to send EXIT_STATUS command upstream
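The "EXIT CODE: 255" in the banner above is consistent with MPI_Abort(MPI_COMM_WORLD, -1): a process exit status is truncated to its low 8 bits, so -1 surfaces as 255. A one-line shell illustration (an aside for the reader, not part of the original run):

```shell
# MPI_Abort's status of -1, truncated to an unsigned 8-bit exit code:
echo $(( -1 & 0xFF ))    # prints 255, matching "EXIT CODE: 255"
```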
+ /opt/ddn/mvapich/bin/mpiexec -ppn 6 -np 12 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/git/ime-evaluation/drop_caches.sh
+ /opt/ddn/mvapich/bin/mpiexec -ppn 6 -np 12 -genv MV2_NUM_HCAS 1 -genv MV2_CPU_BINDING_LEVEL core -genv MV2_CPU_BINDING_POLICY scatter --hosts isc17-c04,isc17-c05 /esfs/jtacquaviva/software/install/ior/git-ddn/bin/ior -i 3 -s 1 -t 102400 -b 23530045440 -D 120 -a MPIIO -e -g -z -k -o /esfs/jtacquaviva/file_read -r
+ tee -a ./output/COUNT:1#NN:2#PPN:6#API:MPIIO#T:102400.txt
IOR-3.0.1: MPI Coordinated Test of Parallel I/O

ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 2, No such file or directory (ior.c:2293)
ior ERROR: block size must be a multiple of transfer size, errno 0, Success (ior.c:2293)
[cli_6]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 6
[cli_7]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 7
[cli_8]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 8
[cli_10]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 10
[cli_11]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 11
[cli_9]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 9
[cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
[cli_5]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 5
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
[cli_2]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 2
[cli_3]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 3
[cli_4]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 4

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 31572 RUNNING AT isc17-c05
= EXIT CODE: 255
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0@isc17-c04] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:912): assert (!closed) failed
[proxy:0:0@isc17-c04] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status
[proxy:0:0@isc17-c04] main (pm/pmiserv/pmip.c:256): demux engine error waiting for event
+ set +x
/esfs/jtacquaviva/ioperf
stripe_count: 4 stripe_size: 1048576 stripe_offset: -1
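The trailing lines record the Lustre layout of the target directory: stripe count 4, 1 MiB (1048576-byte) stripe size, and the default OST offset (-1), in the format printed by lfs getstripe. For reference, a layout like this is typically applied with lfs setstripe; the directory path below is taken from the log, but the command itself is a sketch, not the one actually used to configure the system:

```shell
# Sketch only: apply a 4-stripe, 1 MiB-stripe-size layout to the test directory.
# (The log shows the resulting layout, not the command that created it.)
lfs setstripe -c 4 -S 1M /esfs/jtacquaviva/ioperf
```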