Segmentation Fault WRFv3.9

Segmentation Fault WRFv3.9

Postby brro8458 » Tue Apr 24, 2018 6:46 pm

Hi all, similar story to many others: wrf.exe keeps hitting segmentation faults.

I am running 2 domains on fairly fine grids: 4 km and 1 km. WPS ran successfully and real.exe ran successfully. The rsl.out file reports:

Code:
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.

Backtrace for this error:
#0  0x2B926A013547
#1  0x2B926A013B5E
#2  0x33FDA3265F
#3  0x185ADA6 in taugb3.6388 at module_ra_rrtmg_lw.f90:?
#4  0x187B03E in __rrtmg_lw_taumol_MOD_taumol
#5  0x1893938 in __rrtmg_lw_rad_MOD_rrtmg_lw
#6  0x18A41CA in __module_ra_rrtmg_lw_MOD_rrtmg_lwrad
#7  0x1455519 in __module_radiation_driver_MOD_radiation_driver
#8  0x150C830 in __module_first_rk_step_part1_MOD_first_rk_step_part1
#9  0x10A46CA in solve_em_
#10  0xF9AF1C in solve_interface_
#11  0x46B985 in __module_integrate_MOD_integrate
#12  0x407F03 in __module_wrf_top_MOD_wrf_run


namelist.wps
Code:
&share
 wrf_core = 'ARW',
 max_dom = 2,
 start_date = '2015-07-16_00:00:00','2015-07-16_00:00:00',
 end_date   = '2015-09-07_18:00:00','2015-09-07_18:00:00',
 interval_seconds = 21600
 io_form_geogrid = 2,
/

&geogrid
 parent_id         =   1,   1,
 parent_grid_ratio =   1,   4,
 i_parent_start    =   1,  67,
 j_parent_start    =   1,  67,
 e_we              =  160, 113,
 e_sn              =  160, 113,
 !
 !!!!!!!!!!!!!!!!!!!!!!!!!!!! IMPORTANT NOTE !!!!!!!!!!!!!!!!!!!!!!!!!!!!
 ! The default datasets used to produce the HGT_M, GREENFRAC,
 ! and LU_INDEX/LANDUSEF fields have changed in WPS v3.8. The HGT_M field
 ! is now interpolated from 30-arc-second USGS GMTED2010, the GREENFRAC
 ! field is interpolated from MODIS FPAR, and the LU_INDEX/LANDUSEF fields
 ! are interpolated from 21-class MODIS.
 !
 ! To match the output given by the default namelist.wps in WPS v3.7.1,
 ! the following setting for geog_data_res may be used:
 !
 ! geog_data_res = 'gtopo_10m+usgs_10m+nesdis_greenfrac+10m','gtopo_2m+usgs_2m+nesdis_greenfrac+2m',
 !
 !!!!!!!!!!!!!!!!!!!!!!!!!!!! IMPORTANT NOTE !!!!!!!!!!!!!!!!!!!!!!!!!!!!
 !
 geog_data_res = 'gtopo_2m+usgs_2m+nesdis_greenfrac+2m','gtopo_30s+usgs_30s+nesdis_greenfrac+30s',
 dx = 4000,
 dy = 4000,
 map_proj = 'lambert',
 ref_lat   =  28.0338,
 ref_lon   =  77.2786,
 truelat1  =  26.0,
 truelat2  =  32.5,
 stand_lon =  77,
 geog_data_path = '/home/brooney/WRF/Build_WRF/WPS_GEOG/'
/



namelist.input
Code:
&time_control
 run_days                            = 53,
 run_hours                           = 18,
 run_minutes                         = 0,
 run_seconds                         = 0,
 start_year                          = 2015, 2015,
 start_month                         = 07,   07,
 start_day                           = 16,   16, 
 start_hour                          = 00,   00,
 start_minute                        = 00,   00,
 start_second                        = 00,   00,
 end_year                            = 2015, 2015,
 end_month                           = 09,   09,
 end_day                             = 07,   07, 
 end_hour                            = 18,   18,
 end_minute                          = 00,   00,
 end_second                          = 00,   00, 
 interval_seconds                    = 21600
 input_from_file                     = .true.,.true.,
 history_interval                    = 60, 60, 
 frames_per_outfile                  = 24, 24,
 restart                             = .false.,
 restart_interval                    = 5000,
 io_form_history                     = 2
 io_form_restart                     = 2
 io_form_input                       = 2
 io_form_boundary                    = 2
 debug_level                         = 0
 /

 &domains
 time_step                           = 1,
 time_step_fract_num                 = 0,
 time_step_fract_den                 = 1,
 max_dom                             = 2,
 e_we                                = 160,    113,
 e_sn                                = 160,    113, 
 e_vert                              = 25,     25,   
 p_top_requested                     = 10000,
 num_metgrid_levels                  = 27,
 num_metgrid_soil_levels             = 4,
 dx                                  = 4000, 1000, 
 dy                                  = 4000, 1000, 
 grid_id                             = 1,     2,   
 parent_id                           = 1,     1, 
 i_parent_start                      = 1,     67,
 j_parent_start                      = 1,     67, 
 parent_grid_ratio                   = 1,     4, 
 parent_time_step_ratio              = 1,     1, 
 feedback                            = 1,
 smooth_option                       = 0
 /

 &physics
 physics_suite                       = 'CONUS'   !!! overwritten below
 radt                                = 1,    1,
 bldt                                = 0,     0,   
 cudt                                = 5,     5, 
 icloud                              = 1,
 num_soil_layers                     = 4,
 num_land_cat                        = 24,
!!! CONUS overwritten with !!!
 sf_urban_physics                    = 0,     0, 
 mp_physics                          = 8,     8,
 ra_lw_physics                       = 4,     4,
 ra_sw_physics                       = 4,     4,
 sf_sfclay_physics                   = 2,     2,
 sf_surface_physics                  = 2,     2,
 bl_pbl_physics                      = 2,     2,
 topo_wind                           = 2,     2,
 cu_physics                          = 1,     1,
 isfflx                              = 0,
 ifsnow                              = 0,
 surface_input_source                = 1,
/

 &fdda
 /

 &dynamics
 w_damping                           = 0,
 diff_opt                            = 2,      2,     
 km_opt                              = 4,      4,   
 diff_6th_opt                        = 2,      2,   
 diff_6th_factor                     = 0.12,   0.12,
 base_temp                           = 290.
 damp_opt                            = 3,
 zdamp                               = 5000.,  5000.,
 dampcoef                            = 0.2,    0.2, 
 khdif                               = 0,      0, 
 kvdif                               = 0,      0,
 non_hydrostatic                     = .true., .true.,
 moist_adv_opt                       = 1,      1, 
 scalar_adv_opt                      = 1,      1, 
 gwd_opt                             = 0,
 /

 &bdy_control
 spec_bdy_width                      = 5,
 spec_zone                           = 1,
 relax_zone                          = 4,
 specified                           = .true., .false.,
 nested                              = .false., .true.,
 /

 &grib2
 /

 &namelist_quilt
 nio_tasks_per_group = 0,
 nio_groups = 1,
 /


I have tried:
-- Various time_step values (1, 6, 24); all produced the same segmentation fault.
-- Various processor counts (nprocs = 24 to 144), same issue (see the decomposition sketch after this list).
-- "ulimit -s unlimited" before running wrf.exe, same issue.

Some questions:
-- Does anything in the backtrace hint at the issue?
-- Are any of my namelist options incompatible with each other, or with my 4km and 1km grids?
-- I have set time_step = 1, which is less than 6*DX (with DX in km) for either domain. I have read that parent_time_step_ratio is commonly the same for each domain, though it can be different. Is there any general method for choosing this ratio? (See the illustrative fragment after this list.)
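
For reference, the rule of thumb as I understand it, written out as a namelist fragment; the values are illustrative for the 4 km / 1 km grids above, not a confirmed fix:

Code:
&domains
 ! Rule of thumb: time_step (seconds) <= 6 * dx (km) on the coarse grid, i.e. 6 * 4 = 24 s.
 time_step               = 24,
 parent_grid_ratio       = 1,   4,
 ! Setting the time-step ratio equal to the grid ratio gives the nest 24 / 4 = 6 s,
 ! which also satisfies 6 * dx on the 1 km grid.
 parent_time_step_ratio  = 1,   4,
/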

I have previously run a very similar setup (same domain, grid, and input data, but different physics options) with WRFv3.7, and it was successful. However, that was quite a while ago, and since then there have been some updates to the cluster I work on.

Does anyone have any insights?
Thanks.

Re: Segmentation Fault WRFv3.9

Postby brro8458 » Mon May 07, 2018 5:34 pm

I've seen some replies to similar issues recently, so I'm just wondering if anyone has any thoughts on this?

Thanks.

Re: Segmentation Fault WRFv3.9

Postby petroui » Wed May 09, 2018 4:38 am

Dear brro8458,

I had the same problem some days ago, as I posted here in the forum (viewtopic.php?f=6&t=10679).
I tried some of the standard solutions, like you (reducing the time step, ulimit -s unlimited, etc.), without any result.
In my case, I noticed that the problem might be with the input data. I used to work with data from this dataset (https://rda.ucar.edu/datasets/ds083.2/).
Now I am running WRF with the ECMWF ERA-Interim dataset (https://rda.ucar.edu/datasets/ds627.0/#!description) and I do not get any segmentation fault for the same dates.
Maybe you can try running the model with different input data for the same dates.
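
If it is useful, on the WPS side the swap mostly means re-linking the new GRIB files against the ECMWF Vtable that ships with WPS and re-running ungrib.exe and metgrid.exe; the &ungrib section itself barely changes, and num_metgrid_levels in namelist.input must be updated to match the new dataset. A minimal sketch with the default FILE prefix (illustrative only):

Code:
&ungrib
 out_format = 'WPS',
 prefix     = 'FILE',   ! intermediate-file prefix; fg_name in &metgrid must match
/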
I hope this helps.
Best regards,
petroui

