
Re: [lammps-users] [EXTERNAL] Re: GCMC with Reax


From: "Moore, Stan" <stamoor@...3...>
Date: Tue, 18 Jul 2017 18:12:32 +0000

Sencer,

You can also build the KOKKOS package with Makefile.kokkos_mpi_only, so you don't need to worry about threading at all. We are happy to look into this once you send us a reproducer input deck.
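For reference, an MPI-only KOKKOS build along those lines might look like the following sketch. It assumes the traditional src/MAKE build layout of LAMMPS from this era; the source path is a placeholder for your own checkout:

```shell
# build LAMMPS with the KOKKOS package, MPI backend only (no threading)
cd lammps/src               # placeholder path to your LAMMPS source tree
make yes-kokkos             # enable the KOKKOS package
make kokkos_mpi_only        # build using src/MAKE/OPTIONS/Makefile.kokkos_mpi_only
```

With this build you would still run with `-k on -sf kk` on the command line, but there is no thread count to get wrong.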

Thanks,

Stan

-----Original Message-----
From: Axel Kohlmeyer [mailto:akohlmey@...24...] 
Sent: Tuesday, July 18, 2017 9:41 AM
To: Sencer Selcuk <sselcuk@...1683...>
Cc: Moore, Stan <stamoor@...3...>; LAMMPS Users Mailing List <lammps-users@lists.sourceforge.net>
Subject: Re: [EXTERNAL] Re: [lammps-users] GCMC with Reax

On Tue, Jul 18, 2017 at 9:47 AM, Sencer Selcuk <sselcuk@...1683...> wrote:
> Axel & Stan,
>
> Thank you for responding. Compiling with Kokkos (OMP) solved the 
> problem at least when I run without `overlap_cutoff`. This option 
> still causes a seg fault immediately - both with dev and stable versions.

not here. i can add the flag to any of the existing gcmc input examples and it works fine.

>
> Perhaps I should open another thread but: I cannot use threading with
> Kokkos even for a simple `fix nve` run. I add `-k on t 2 -sf kk` to the
> command-line arguments, and it hangs indefinitely. I was able to run
> only with `-k on t 1 -sf kk`. Do I need to do anything else?

please prefix your command line with: gdb --args
when gdb has launched, type: run
let it run for a bit until you are certain it is hung, then hit CTRL-C, type: where
and send us the screen output. that will tell us in which subroutine it hangs.

for the segfault, please do the same thing: after lammps has segfaulted, type: where
and e-mail us the stack trace.
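The steps above might look like this in practice; "lmp_kokkos" and "in.gcmc" are placeholders for your actual binary and input file:

```shell
# run LAMMPS under gdb and capture a backtrace once it hangs or segfaults
gdb --args ./lmp_kokkos -k on t 2 -sf kk -in in.gcmc
# then, at the (gdb) prompt:
#   run      <- starts LAMMPS; wait until it is clearly hung, then hit CTRL-C
#              (on a segfault, gdb stops on its own, no CTRL-C needed)
#   where    <- prints the stack trace; paste that output into your reply
```

The `where` output names the subroutine the code is stuck in, which is usually enough for a developer to narrow down the cause.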

axel.

> I will try to reproduce the two problems with a smaller system and 
> send the data files etc. I just wanted to give a heads up.
>
> Best,
> Sencer
>
>
> On Mon, Jul 17, 2017 at 10:23 AM, Moore, Stan <stamoor@...3...> wrote:
>
> There are known problems with using GCMC with USER-REAXC. You should
> try Kokkos ReaxFF instead; it is more memory robust and others have
> had success with it.
>
> Stan
>
> ________________________________________
> From: Axel Kohlmeyer <akohlmey@...24...>
> Sent: Sunday, July 16, 2017 9:34 PM
> To: Sencer Selcuk
> Cc: LAMMPS Users Mailing List
> Subject: [EXTERNAL] Re: [lammps-users] GCMC with Reax
>
> On Sun, Jul 16, 2017 at 10:38 PM, Sencer Selcuk <sselcuk@...1683...> wrote:
>
> Dear LAMMPS users,
>
> I am trying to set up a GCMC, and eventually a GCMC/NVT, grand-canonical
> calculation. I am using ReaxFF with USER-REAXC on a graphene oxide system
> with large defects, and trying to insert C atoms into the system with
> `fix gcmc` to model healing of the defects. I understand that this is not
> the typical use case of this fix; all the examples I have seen study
> adsorption of gas molecules etc. So my first (or perhaps zeroth) question
> is: do you think it would be unwise to use GCMC for a problem like this?
>
> I use the following input file, and my system (read by the read_data
> data.in command) has no obvious problems, as it runs well with fix nvt.
> However, when I try to run the appended input, the GCMC calculation just
> hangs for hours. It creates the dump file, writing out the initial
> coordinates, and prints the thermo line only for the first step, but does
> nothing else. Am I doing anything wrong here?
>
> Finally, when I uncomment the overlap_cutoff 0.6 part, the calculation
> immediately crashes with a segmentation fault. My feeling is that the two
> problems are related, but I don't have any clues; as such, I would
> appreciate any help! I am using the latest stable LAMMPS version on a
> supercomputer, running the job on a single node with all 16 processors.
>
> the first step you need to take is to try the same input with the latest
> development version, in order to verify that your issue is not already
> solved. if the issue persists, please also provide the data file and
> the potential file, so that a developer can try to reproduce it and
> track down the origin of the segfault/hang.
>
> axel.
>
> Best,
> Sencer
>
> Postdoctoral Fellow
> Department of Chemistry
> Princeton University
>
> units           real
> atom_style      charge
> boundary        p p p
> read_data       data.in
> timestep        0.10
> pair_style      reax/c lmp_control lgvdw yes safezone 1.6 mincap 150
> pair_coeff      * * reax.fgs C H O
> fix             reax all qeq/reax 1 0.0 10.0 1e-6 reax/c
> thermo          1000
> thermo_style    custom step time temp press vol pe etotal enthalpy
> thermo_modify   flush yes
> dump            1 all custom 1000 fgs.lammpstrj element xu yu zu
> dump_modify     1 sort id element C H O append yes
> fix             mc all gcmc 1 100 100 1 1824 1500.0 -1.0 1.0 # overlap_cutoff 0.6
> run             1000000
> ------------------------------------------------------------------------------
> Check out the vibrant tech community on one of the world's most engaging
> tech sites, Slashdot.org! http://sdm.link/slashdot
> _______________________________________________
> lammps-users mailing list
> lammps-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/lammps-users
>
> --
> Dr. Axel Kohlmeyer  akohlmey@...24...  http://goo.gl/1wk0
> College of Science & Technology, Temple University, Philadelphia PA, USA
> International Centre for Theoretical Physics, Trieste, Italy.



--
Dr. Axel Kohlmeyer  akohlmey@...24...  http://goo.gl/1wk0
College of Science & Technology, Temple University, Philadelphia PA, USA
International Centre for Theoretical Physics, Trieste, Italy.