:orphan:

# VecCreateMPIViennaCLWithArrays

Creates a parallel, array-style vector, where the user provides the ViennaCL vector to store the vector values.

## Synopsis

```
#include "petscvec.h"
PetscErrorCode VecCreateMPIViennaCLWithArrays(MPI_Comm comm, PetscInt bs, PetscInt n, PetscInt N, const PetscScalar cpuarray[], const ViennaCLVector *viennaclvec, Vec *vv)
```

Collective

## Input Parameters

- ***comm -*** the MPI communicator to use
- ***bs -*** block size, same meaning as VecSetBlockSize()
- ***n -*** local vector length, cannot be PETSC_DECIDE
- ***N -*** global vector length (or PETSC_DECIDE to have it calculated)
- ***cpuarray -*** the user-provided CPU array to store the vector values
- ***viennaclvec -*** ViennaCL vector where the Vec entries are to be stored on the device

## Output Parameter

- ***vv -*** the vector

## Notes

If both cpuarray and viennaclvec are provided, the caller must ensure that they contain identical values.

Use VecDuplicate() or VecDuplicateVecs() to form additional vectors of the same type as an existing vector.

PETSc does NOT free the provided arrays when the vector is destroyed via VecDestroy(); the caller remains responsible for them, but must not free either array until the vector is destroyed.

## See Also

`VecCreateSeqViennaCLWithArrays()`, `VecCreateMPIWithArray()`, `VecCreate()`, `VecDuplicate()`, `VecDuplicateVecs()`, `VecCreateGhost()`, `VecCreateMPI()`, `VecCreateGhostWithArray()`, `VecViennaCLPlaceArray()`, `VecPlaceArray()`, `VecCreateMPICUDAWithArrays()`, `VecViennaCLAllocateCheckHost()`

## Level

intermediate

## Location

src/vec/vec/impls/mpi/mpiviennacl/mpiviennacl.cxx
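## Example Usage

A minimal sketch, not part of the official manual page, assuming a ViennaCL-enabled PETSc build. Since the Notes above treat cpuarray and viennaclvec as individually optional, this sketch supplies only a host array and passes NULL for the ViennaCL vector; the local length and values are arbitrary illustrations:

```
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         v;
  PetscInt    n           = 2; /* local length; cannot be PETSC_DECIDE */
  PetscScalar cpuarray[2] = {1.0, 2.0};

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Supply only the host array; NULL for the ViennaCL vector is assumed
     acceptable here, following the Notes above that cover the case where
     both arrays are provided. Each MPI rank passes its own local array. */
  PetscCall(VecCreateMPIViennaCLWithArrays(PETSC_COMM_WORLD, 1, n, PETSC_DECIDE, cpuarray, NULL, &v));
  PetscCall(VecView(v, PETSC_VIEWER_STDOUT_WORLD));
  PetscCall(VecDestroy(&v)); /* does NOT free cpuarray */
  PetscCall(PetscFinalize());
  return 0;
}
```

Because VecDestroy() does not free the provided arrays, cpuarray here is a stack array that simply goes out of scope after the vector is destroyed.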