Folder for input files.

Details of the elevation and node-polygon datasets extracted from high-resolution
LiDAR data. The input data is organized into a set of files containing values
extracted from the LiDAR survey. The dataset was curated by Chandana Gangodagamage
and organized into a set of input files. These input files include ground surface
elevations, polygon shape coordinates, and widths for rims and troughs.

Thank you to Craig Tweedie, University of Texas at El Paso, for providing the
LiDAR data used in these projects.

Data from Chandana is post-processed into formats readable by the script
mesh_adjoin.py. This directory now contains the pre- and post-processed files
for use in multiple mesh creations. The original work was run in
polys_intermediate using mesh_adjoin_var.py. Once these files were written,
the sections generating them were removed from the scripts.

Starting input files: poly100##.avs, lidar_site3.inp, and params.csv
Post-processed files for the intermediate polygons: lidar_site3_surf.inp and param_##.txt

INPUT  lidar_site3.inp
OUTPUT lidar_site3_surf.inp       (connected into a tri mesh, elevations saved in z_save)
OUTPUT lidar_site3_surf_trans.inp (translated by trans 585800. 7910200.)

    lagrit < process_lidar.lgi > process_lidar.out.txt

INPUT  params.csv    (from polygon_areaC_trough_halfwidths_rims_final72.csv)
OUTPUT param_##.txt  (offset parameters for each polygon giving trough and rim widths)

Script to write a variable params file for each polygon based on
polygon_areaC_trough_halfwidths_rims_final72.csv, renamed to params.csv:

    import csv

    def read_csv_params(file_, separator=","):
        '''Write several params files if the parameters are non-uniform.'''
        global cwd
        input_folder = "input"
        # parameters file content, e.g.
        # params = ["0.05", "2", "-1.416", "-3.069"]
        # params file naming convention
        # poly10001_params.txt
        input_file = cwd + "/" + input_folder + "/" + file_
        # the first line of the csv file is the header; skip it
        first_line = 1
        with open(input_file, 'r', newline='') as csvfile:
            csv_values = csv.reader(csvfile, delimiter=separator)
            for row in csv_values:
                if not first_line:
                    # widths are stored as positive values in the csv;
                    # negate them for the offset parameters
                    params = ["0.05", "2",
                              str(float(row[1]) * (-1)),
                              str(float(row[2]) * (-1))]
                    # zero-pad single-digit polygon ids
                    if int(row[0]) < 10:
                        suffix = "0" + str(row[0])
                    else:
                        suffix = str(row[0])
                    poly_param_file = "param_" + suffix + ".txt"
                    write_params(poly_param_file, params)
                else:
                    first_line = 0

Script to use the variable param_##.txt files to generate each set of offsets:

    import os

    def offset_polys(polys, is_uniform=1):
        '''Offset the polygons in the polys list using a params file.'''
        # sort points counterclockwise in each file
        # and save the result in filename_rev.inp
        params_file = " params.txt"
        for fname in polys:
            file_name = reverse_sorting(fname)
            # offset_polygon outputs 2 files per polygon:
            #   1. poly#_sim1.inp (outer polygon)
            #   2. poly#_sim2.inp (inner polygon)
            if not is_uniform:
                params_file = " param_" + file_name[7:9] + ".txt"
                print(params_file)
            os.system("../offset_polygon " + file_name + params_file)
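
The helper write_params is referenced above but not shown in this README. A minimal
sketch, assuming it simply writes one parameter per line into the named file in the
current working directory (the actual format expected by offset_polygon may differ):

    def write_params(file_name, params):
        '''Hypothetical helper: write one offset parameter per line.'''
        with open(file_name, 'w') as f:
            for p in params:
                f.write(str(p) + "\n")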
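
As a usage illustration only: the params.csv column layout below is inferred from the
code (polygon id in column 0, two half-widths in columns 1 and 2); the real header
names and values may differ.

    # params.csv (assumed layout):
    #   poly_id,trough_halfwidth,rim_halfwidth
    #   1,1.416,3.069
    #   ...

    import os
    cwd = os.getcwd()              # module-level global used by read_csv_params
    read_csv_params("params.csv")  # -> writes param_01.txt, param_02.txt, ... with negated widths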
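
A possible driver for the non-uniform case, assuming the polygon files follow the
poly100##.avs naming noted above and that reverse_sorting returns names like
poly100##_rev.inp (so characters [7:9] are the two-digit polygon id):

    import glob

    polys = sorted(glob.glob("poly100*.avs"))
    offset_polys(polys, is_uniform=0)   # uses param_##.txt for each polygon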