Changes in / [5c84add:601b93d] in sasview

Files:
- 26 added
- 9 deleted
- 17 edited

Legend:
- Unmodified
- Added
- Removed
.travis.yml (r58918de → r937529e)

  before_install:
- - 'if [ $TRAVIS_PYTHON_VERSION == "2.7" ]; then sudo apt-get update;sudo apt-get install python-numpy python-scipy python-matplotlib libhdf5-serial-dev python-h5py fglrx opencl-headers python-pyopencl; fi'
+ - 'if [ $TRAVIS_PYTHON_VERSION == "2.7" ]; then sudo apt-get update;sudo apt-get install python-numpy python-scipy python-matplotlib libhdf5-serial-dev python-h5py fglrx opencl-headers; fi'

  install:
  - pip install -r build_tools/requirements.txt
+ - pip install pyopencl

  before_script:
  …
  - ls -ltr
  - if [ ! -d "utils" ]; then mkdir utils; fi
- - /bin/sh -xe sasview/build_tools/travis_build.sh
+ - /bin/sh -xe sasview/build_tools/jenkins_linux_build.sh
  # - /bin/sh -xe sasview/build_tools/jenkins_linux_test.sh
  - export LC_ALL=en_US.UTF-8
docs/sphinx-docs/source/user/sasgui/perspectives/invariant/invariant_help.rst (r9bbc074 → r5f5c596)

  .. by S King, ISIS, during SasView CodeCamp-III in Feb 2015.

- Invariant Calculation
- =====================
+ Invariant Calculation Perspective
+ =================================

  Description
  …
  .. image:: image001.gif

- where *g = q* for pinhole geometry (SAS) and *g = q*\ :sub:`v` (the slit height) for
+ where *g = Q* for pinhole geometry (SAS) and *g = Qv* (the slit height) for
  slit geometry (USAS).
  …
- Using invariant analysis
- ------------------------
+ Using the perspective
+ ---------------------

  1) Select *Invariant* from the *Analysis* menu on the SasView toolbar.
  …
  3) Select a dataset and use the *Send To* button on the *Data Explorer* to load
-    the dataset into the *Invariant* panel.
+    the dataset into the *Invariant* perspective.

- 4) Use the *Customised Input* boxes on the *Invariant* panel to subtract
+ 4) Use the *Customised Input* boxes on the *Invariant* perspective to subtract
     any background, specify the contrast (i.e. difference in SLDs - this must be
     specified for the eventual value of Q*\ to be on an absolute scale), or to
  …
  8) If the value of Q*\ calculated with the extrapolated regions is invalid, a
-    red warning will appear at the top of the *Invariant* panel.
+    red warning will appear at the top of the *Invariant* perspective panel.

  The details of the calculation are available by clicking the *Details*
docs/sphinx-docs/source/user/user.rst (r20a3c55 → r5a71761)

  Working with SasView <working>
-
- Computations with GPU <gpu_computations>
sasview/local_config.py (r9bbc074 → rd85c194)

  _corner_image = os.path.join(icon_path, "angles_flat.png")
  _welcome_image = os.path.join(icon_path, "SVwelcome.png")
- _copyright = "(c) 2009 - 2016, UTK, UMD, NIST, ORNL, ISIS, ESS and ILL"
+ _copyright = "(c) 2009 - 2013, UTK, UMD, NIST, ORNL, ISIS, ESS and ILL"
sasview/setup_exe.py (r9bbc074 → r525aaa2)

  self.version = local_config.__version__
  self.company_name = "SasView.org"
- self.copyright = "copyright 2009 - 2016"
+ self.copyright = "copyright 2009 - 2013"
  self.name = "SasView"
src/sas/sascalc/dataloader/readers/ascii_reader.py (r7d94915 → rb699768)

The read() method is extensively reworked. The main hunks:

- The splitline() helper at the bottom of the class (comma split, then
  semicolon split, then whitespace) is removed, and the same delimiter
  detection is inlined at the top of the per-line loop.

- Unit-conversion support is reinstated: when has_converter is True, Converter
  objects data_conv_q ('1/A') and data_conv_i ('1/cm') are created up front,
  tested against the output units, and applied to _x, _y, _dy and _dx as each
  value is parsed. The final axis labels use output.x_unit / output.y_unit
  when a converter is active, falling back to 'A^{-1}' and 'cm^{-1}' otherwise.

- The line-classification bookkeeping changes from the named counters
  (min_data_pts = 5, candidate_lines, candidate_lines_previous) to the index
  counters i, i1, j, j1 with mum_data_lines = 5, and the Data1D object with
  its x, y, dy, dx arrays is allocated before the loop rather than after the
  lexsort reordering.

- The "return None if not is_data" early exit is dropped, and the sanity
  checks compare the final arrays directly:

-     if has_error_dy == True and not len(ty) == len(tdy):
+     if has_error_dy == True and not len(y) == len(dy):
  …
-     if len(tx) == 0:
+     if len(x) == 0:
          raise RuntimeError, "ascii_reader: could not load file"

- A comment is changed:

-     # characters that breaks the open operation
+     # characters that brakes the open operation
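The splitline() helper that this hunk removes is small enough to show on its own. A standalone sketch of the same comma-then-semicolon-then-whitespace fallback, taken from the diff above:

```python
def splitline(line):
    """Split a line of data on common delimiters.

    Mirrors the detection order in the ASCII reader: try CSV (comma) first,
    then semicolon-separated values, then plain whitespace. A split is
    accepted as soon as it yields at least two tokens.
    """
    # Initial try for CSV (split on ,)
    toks = line.split(',')
    # Now try SCSV (split on ;)
    if len(toks) < 2:
        toks = line.split(';')
    # Now go for whitespace
    if len(toks) < 2:
        toks = line.split()
    return toks
```

Note the order matters: a whitespace-first split would mis-handle CSV rows that contain no spaces, so the reader only falls back to whitespace when the delimiter splits fail.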
src/sas/sasgui/guiframe/media/graph_help.rst (rf9b0c81 → re68c9bf)

- .. note::
-    *If a residuals graph (when fitting data) is hidden, it will not show up
-    after computation.*
+ *NOTE! If a residuals graph (when fitting data) is hidden, it will not show up
+ after computation.*
  …
  After zooming in on a region, the *left arrow* or *right arrow* buttons on
  the toolbar will switch between recent views.

- The axis range can also be specified manually. To do so go to the *Graph Menu*
- (see Invoking_the_graph_menu_ for further details), choose the *Set Graph Range*
- option and enter the limits in the pop box.

  *NOTE! If a wheel mouse is available scrolling the wheel will zoom in/out
  …
- It is possible to make custom modifications to plots including:
+ From the *Graph Menu* (see Invoking_the_graph_menu_) it is also possible to
+ make some custom modifications to plots, including:

  * changing the plot window title
- * changing the default legend location and toggling it on/off
- * changing the axis label text
- * changing the axis label units
- * changing the axis label font & font colour
+ * changing the axis legend locations
+ * changing the axis legend label text
+ * changing the axis legend label units
+ * changing the axis legend label font & font colour
  * adding/removing a text string
  * adding a grid overlay

- The legend and text strings can be drag and dropped around the plot
-
- These options are accessed through the *Graph Menu* (see Invoking_the_graph_menu_)
- and selecting *Modify Graph Appearance* (for axis labels, grid overlay and
- legend position) or *Add Text* to add textual annotations, selecting font, color,
- style and size. *Remove Text* will remove the last annotation added.
- *Window Title* allows a custom title to be entered instead of Graph x.
  …
- .. note::
-    The Remove data set action cannot be undone.
+ *NOTE! This action cannot be undone.*
  …
  In the *Dataset Menu* (see Invoking_the_dataset_menu_), select *Modify Plot
  Property* to change the size, color, or shape of the displayed marker for the
- chosen dataset, or to change the dataset label that appears in the plot legend
- box.
+ chosen dataset, or to change the dataset label that appears on the plot.
  …
- .. note::
-    The displayed average only updates when input focus is moved back to
-    that window; ie, when the mouse pointer is moved onto that plot.
+ *NOTE! The displayed average only updates when input focus is moved back to
+ that window; ie, when the mouse pointer is moved onto that plot.*

  Selecting *Box Sum* automatically brings up the 'Slicer Parameters' dialog in
  …
- .. note:: This help document was last modified by Paul Butler, 05 September, 2016
+ .. note:: This help document was last changed by Steve King, 01May2015
src/sas/sasgui/guiframe/utils.py (ra0373d5 → rd85c194)

The check_int() helper is removed:

      return flag

-
- def check_int(item):
-     """
-     :param item: txtcrtl containing a value
-     """
-     flag = True
-     try:
-         mini = int(item.GetValue())
-         item.SetBackgroundColour(wx.WHITE)
-         item.Refresh()
-     except:
-         flag = False
-         item.SetBackgroundColour("pink")
-         item.Refresh()
-     return flag

  class PanelMenu(wx.Menu):
      """
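For reference, the validation inside the removed check_int() can be sketched without the wx dependency. check_int_value is our name for this pure-function version; the white/pink background colouring of the text control is deliberately left out:

```python
def check_int_value(text):
    """Return True when text parses as an integer, False otherwise.

    Pure-function analogue of the GUI helper: the original called
    int(item.GetValue()) and coloured the control white on success and
    pink on failure; here we only report the validity.
    """
    try:
        int(text)
        return True
    except (ValueError, TypeError):
        return False
```

Note that, like int() itself, this rejects float-looking strings such as "5.5", which is what distinguishes it from the check_float() helper that remains in utils.py.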
src/sas/sasgui/perspectives/fitting/basepage.py (r6c382da → ree4b3cb)

This is the largest hunk in the changeset. The main changes:

- Imports: the POLYDISPERSITY_MODELS dictionary from sasmodels.weights and the
  check_int helper from sas.sasgui.guiframe.utils are no longer imported;
  import sasmodels.sasview_model is added instead. The polydispersion help
  tooltip changes from "Help for polydispersion." to "Helps for
  Polydispersion.".

- Dispersion state is saved as the object itself rather than its type name
  (self.state._disp_obj_dict[k] = v instead of v.type), and array dispersion
  is built from sas.models.dispersion_models.ArrayDispersion rather than
  POLYDISPERSITY_MODELS['array'](). When a saved state is reloaded, a
  dispersion stored as a string is reconstructed by exec-ing an import
  statement built from sasmodels.weights.

- In the parameter-validation loop, the special handling for .npts
  (check_int) and .nsigmas (check_float) values is removed; rows ending in
  .npts or .nsigmas are skipped, and all other rows go through the same float
  checks on value, min and max. Empty bounds are treated as -inf/+inf, an
  inverted range is flagged by colouring both min and max controls pink, and
  the value is clamped into [low, high] before being pushed into the model
  with self.model.setParam(name, value).

- The dispersion-function selection lookup against POLYDISPERSITY_MODELS is
  replaced by a hard-coded list ["RectangleDispersion", "ArrayDispersion",
  "LogNormalDispersion", "GaussianDispersion", "SchulzDispersion"], with
  index 3 (GaussianDispersion) as the fallback.

- _paste_poly_help loses its docstring describing the clipboard formats
  (parameter.width as ['FLOAT', 'DISTRIBUTION', ''] or
  ['FILENAME', 'array', [x1, ...], [w1, ...]]) and is rewritten around an
  is_array flag; the int conversion for pasted .npts values and the
  re-enabling of controls (item[2].Enable(True)) for non-array functions are
  dropped.

- The two-way _set_disp_cb(isarray, item) helper is replaced by the one-way
  _set_disp_array_cb(item), which only disables the PD/Min/Max controls for
  array dispersion; the corresponding item[3].Show(True) / item[4].Show(True)
  calls on the non-array path are removed.
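The min/max handling in the validation loop above can be illustrated with a small standalone sketch. clamp_to_range is a hypothetical helper, not a function in basepage.py, and the pink-highlighting side effect on an inverted range is reduced here to a None return:

```python
import math

def clamp_to_range(value_str, min_str, max_str):
    """Sketch of the fit-range logic: empty bounds mean unbounded
    (-inf / +inf), max below min is invalid, and the value is forced
    into [low, high] before being written back to the model.

    Returns (value, low, high), or None when the range is inverted
    (the GUI colours both bound controls pink in that case).
    """
    low = -math.inf if min_str == "" else float(min_str)
    high = math.inf if max_str == "" else float(max_str)
    if high < low:
        return None
    # Force value between min and max, as setParam expects
    value = min(max(float(value_str), low), high)
    return value, low, high
```

The clamping mirrors what the panel does to the value control: an out-of-range entry is silently pulled back to the nearest bound rather than rejected.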
src/sas/sasgui/perspectives/fitting/fitpage.py (r6c382da → ree4b3cb)

  import math
  import time
- import traceback

  from sasmodels.weights import MODELS as POLYDISPERSITY_MODELS
  …
  msg = "Error: This model state has missing or outdated "
  msg += "information.\n"
- msg += traceback.format_exc()
+ msg += "%s" % (sys.exc_value)
  wx.PostEvent(self._manager.parent,
               StatusEvent(status=msg, info="error"))
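The difference between the two error strings in this hunk is easy to demonstrate. sys.exc_value is Python 2 only; this sketch uses the equivalent sys.exc_info()[1] so it runs on Python 3, and describe_error is our illustrative name, not a function in fitpage.py:

```python
import sys
import traceback

def describe_error():
    """Contrast the two strings the diff switches between: the full
    multi-line traceback versus just the exception value."""
    try:
        {}["missing"]  # deliberately raise a KeyError
    except KeyError:
        full = traceback.format_exc()        # whole traceback, many lines
        short = "%s" % (sys.exc_info()[1],)  # exception value only
    return full, short
```

The traceback version gives the user the file and line of the failure at the cost of a much longer status message; the exception-value version keeps the status bar readable.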
src/sas/sasgui/perspectives/fitting/media/fitting.rst (r05829fb → rd85c194)

  Information on the SasView Optimisers <optimizer.rst>
-
- Writing a Plugin <plugin.rst>
src/sas/sasgui/perspectives/fitting/media/fitting_help.rst
r05829fb rb64b87c 132 132 * By :ref:`Writing_a_Plugin` 133 133 134 *NB: Because of the way these options are implemented, it is not possible for them* 135 *to use the polydispersity algorithms in SasView. Only models in the model library* 136 *can do this. At the time of writing (Release 3.1.0) work is in hand to make it* 137 *easier to add new models to the model library.* 138 134 139 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 135 140 … … 158 163 the :ref:`Advanced` option. 159 164 160 *NB: "Fit Parameters" has been split into two sections, those which can be 161 polydisperse (shape and orientation parameters) and those which are not 162 (scattering length densities, for example).* 163 164 165 Sum|Multi(p1,p2) 165 166 ^^^^^^^^^^^^^^^^ … … 191 192 *Advanced Custom Model Editor*. 192 193 193 See :ref:`Writing_a_Plugin` for details on the plugin format. 194 195 *NB: Sum/Product models are still using the SasView 3.x model format. Unless 196 you are confident about what you are doing, it is recommended that you 197 only modify lines denoted with the ## <----- comments!* 194 *NB: Unless you are confident about what you are doing, it is recommended that you* 195 *only modify lines denoted with the ## <----- comments!* 198 196 199 197 When editing is complete, select *Run -> Compile* from the *Model Editor* menu bar. An … … 213 211 214 212 *NB: Custom models shipped with SasView cannot be removed in this way.* 213 214 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 215 216 .. _Writing_a_Plugin: 217 218 Writing a Plugin 219 ---------------- 220 221 Advanced users can write their own model in Python and save it to the SasView 222 *plugin_models* folder 223 224 *C:\\Users\\[username]\\.sasview\\plugin_models* - (on Windows) 225 226 in .py format. The next time SasView is started it will compile the plugin and add 227 it to the list of *Customized Models*.
228 229 It is recommended that existing plugin models be used as templates. 215 230 216 231 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ -
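The "Writing a Plugin" text in the chunk above can be illustrated with a minimal sketch. This is a hypothetical example, not a file shipped with SasView: it assumes the sasmodels-style plugin layout (module-level `name`/`title`/`description`/`parameters` plus an `Iq` function), and the model name `linear_plugin` and its `slope` parameter are invented for illustration.

```python
# Hypothetical minimal plugin model (sketch only, assuming the
# sasmodels-style plugin format). Save as e.g. linear_plugin.py in the
# plugin_models folder described above.
import numpy as np

name = "linear_plugin"            # invented name for this sketch
title = "Straight-line test model"
description = "I(q) = slope*q (background is added by SasView itself)"

# [name, units, default, [lower, upper], type, description]
parameters = [
    ["slope", "", 1.0, [-np.inf, np.inf], "", "gradient of the line"],
]

def Iq(q, slope=1.0):
    """Return the model intensity I(q) for an array of q values."""
    return slope * np.asarray(q, dtype=float)

Iq.vectorized = True  # Iq accepts an array of q values
```

Existing files in *plugin_models* show the same structure with real form factors in place of the straight line, which is why using them as templates is recommended.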
src/sas/sasgui/perspectives/fitting/pagestate.py
r6c382da r654e8e0 24 24 from xml.dom.minidom import parseString 25 25 from lxml import etree 26 27 import sasmodels.weights 28 26 29 27 import sas.sascalc.dataloader … … 476 474 value = content[1] 477 475 except Exception: 478 msg = "Report string expected 'name: value' but got %r"%line 479 logging.error(msg) 476 logging.error(traceback.format_exc()) 480 477 if name.count("State created"): 481 478 repo_time = "" + value … … 519 516 title_name = HEADER % title 520 517 except Exception: 521 msg = "While parsing 'data: ...'\n" 522 logging.error(msg + traceback.format_exc()) 518 logging.error(traceback.format_exc()) 523 519 if name == "model name ": 524 520 try: … … 535 531 q_range = CENTRE % q_name 536 532 except Exception: 537 msg = "While parsing 'Plotting Range: ...'\n" 538 logging.error(msg + traceback.format_exc()) 533 logging.error(traceback.format_exc()) 539 534 paramval = "" 540 535 for lines in param_string.split(":"): … … 716 711 # For self.values ={ disp_param_name: [vals,...],...} 717 712 # and for self.weights ={ disp_param_name: [weights,...],...} 713 value_list = {} 718 714 for item in LIST_OF_MODEL_ATTRIBUTES: 719 715 element = newdoc.createElement(item[0]) … … 729 725 730 726 # Create doc for the dictionary of self._disp_obj_dic 731 for tagname, varname, tagtype in DISPERSION_LIST: 732 element = newdoc.createElement(tagname) 733 value_list = getattr(self, varname) 734 for key, value in value_list.iteritems(): 727 for item in DISPERSION_LIST: 728 element = newdoc.createElement(item[0]) 729 value_list = getattr(self, item[1]) 730 for key, val in value_list.iteritems(): 731 value = repr(val) 735 732 sub_element = newdoc.createElement(key) 736 733 sub_element.setAttribute('name', str(key)) … … 850 847 # Recover _disp_obj_dict from xml file 851 848 self._disp_obj_dict = {} 852 for tagname, varname, tagtype in DISPERSION_LIST: 853 node = get_content("ns:%s" % tagname, entry) 849 for item in DISPERSION_LIST: 850 # Get node 851 node = get_content("ns:%s" % item[0], entry)
854 852 for attr in node: 855 parameter = str(attr.get('name')) 856 value = attr.get('value') 857 if value.startswith("<"): 858 try: 859 # <path.to.NamedDistribution object/instance...> 860 cls_name = value[1:].split()[0].split('.')[-1] 861 cls = getattr(sasmodels.weights, cls_name) 862 value = cls.type 863 except Exception: 864 logging.error("unable to load distribution %r for %s" 865 % (value, parameter)) 866 continue 867 _disp_obj_dict = getattr(self, varname) 868 _disp_obj_dict[parameter] = value 853 name = str(attr.get('name')) 854 val = attr.get('value') 855 value = val.split(" instance")[0] 856 disp_name = value.split("<")[1] 857 try: 858 # Try to recover disp_model object from strings 859 com = "from sas.models.dispersion_models " 860 com += "import %s as disp" 861 com_name = disp_name.split(".")[3] 862 exec com % com_name 863 disp_model = disp() 864 attribute = getattr(self, item[1]) 865 attribute[name] = com_name 866 except Exception: 867 logging.error(traceback.format_exc()) 869 868 870 869 # get self.values and self.weights dic. if exists
871 for tagname, varname in LIST_OF_MODEL_ATTRIBUTES: 872 node = get_content("ns:%s" % tagname, entry) 870 for item in LIST_OF_MODEL_ATTRIBUTES: 871 node = get_content("ns:%s" % item[0], entry) 873 872 dic = {} 874 873 value_list = [] 875 874 for par in node: 876 875 name = par.get('name') 877 values = par.text.split( ) 876 values = par.text.split('\n') 878 877 # Get lines only with numbers 879 878 for line in values: … … 883 882 except Exception: 884 883 # pass if line is empty (it happens) 885 msg = ("Error reading %r from %s %s\n" 886 % (line, tagname, name)) 887 logging.error(msg + traceback.format_exc()) 884 logging.error(traceback.format_exc()) 888 885 dic[name] = numpy.array(value_list) 889 setattr(self, varname, dic) 886 setattr(self, item[1], dic) 890 887 891 888 def set_plot_state(self, figs, canvases): … … 1234 1231 1235 1232 except: 1236 logging.info("XML document does not contain fitting information.\n" 1237 + traceback.format_exc()) 1233 logging.info("XML document does not contain fitting information.\n %s" % sys.exc_value) 1238 1234 1239 1235 return state -
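One side of the pagestate.py diff above reconstructs dispersion objects by `exec`-ing a string, the other by looking the class name up in `sasmodels.weights` with `getattr`. The lookup pattern can be shown standalone; in this sketch `WeightsStandIn` is an invented stand-in namespace, not the real `sasmodels.weights` module.

```python
# Sketch of recovering a dispersion type from a stored repr() string by
# class-name lookup instead of exec. WeightsStandIn mimics the role of
# the sasmodels.weights module (hypothetical stand-in for illustration).
class GaussianDispersion(object):
    type = "gaussian"

class WeightsStandIn(object):
    GaussianDispersion = GaussianDispersion

def recover_dispersion(value, namespace=WeightsStandIn):
    """Map '<pkg.mod.GaussianDispersion object at 0x...>' to its type name."""
    if value.startswith("<"):
        # strip '<', take the dotted path, keep only the class name
        cls_name = value[1:].split()[0].split('.')[-1]
        cls = getattr(namespace, cls_name)
        return cls.type
    return value  # already a plain type name
```

Resolving the name with `getattr` against a known module restricts what the saved file can cause to run, whereas `exec` on file contents will execute whatever import string was stored.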
src/sas/sasgui/perspectives/pr/media/pr_help.rst
r0391dae rb64b87c 15 15 *P(r)* is set to be equal to an expansion of base functions of the type 16 16 17 .. math:: 18 \Phi_n(r) = 2 r \sin(\frac{\pi n r}{D_{max}}) 17 |bigphi|\_n(r) = 2.r.sin(|pi|\ .n.r/D_max) 19 18 20 The coefficient of each base function in the expansion is found by performing 19 The coefficient of each base function in the expansion is found by performing 21 20 a least square fit with the following fit function 22 21 23 .. math:: 22 |chi|\ :sup:`2` = |bigsigma|\ :sub:`i` [ I\ :sub:`meas`\ (Q\ :sub:`i`\ ) - I\ :sub:`th`\ (Q\ :sub:`i`\ ) ] :sup:`2` / (Error) :sup:`2` + Reg_term 24 23 25 \chi^2=\frac{\sum_i (I_{meas}(Q_i)-I_{th}(Q_i))^2}{error^2}+Reg\_term 26 24 where I\ :sub:`meas`\ (Q) is the measured scattering intensity and 25 I\ :sub:`th`\ (Q) is the prediction from the Fourier transform of the *P(r)* 26 expansion. 27 27 28 where $I_{meas}(Q_i)$ is the measured scattering intensity and $I_{th}(Q_i)$ is 29 the prediction from the Fourier transform of the *P(r)* expansion. 30 31 The $Reg\_term$ term is a regularization term set to the second derivative 32 $d^2P(r)/dr^2$ integrated over $r$. It is used to produce a smooth *P(r)* output. 28 The *Reg_term* term is a regularization term set to the second derivative 29 d\ :sup:`2`\ *P(r)* / dr\ :sup:`2` integrated over *r*. It is used to produce a 30 smooth *P(r)* output. 33 31 34 32 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ … … 47 45 system. 48 46 49 P(r) inversion requires that the background be perfectly subtracted.
This is50 often difficult to do well and thus many data sets will include a background.51 For those cases, the user should check the "estimate background" box and the52 module will do its best to estimate it.53 54 The P(r) module is constantly computing in the background what the optimum55 *number of terms* should be as well as the optimum *regularization constant*.56 These are constantly updated in the buttons next to the entry boxes on the GUI.57 These are almost always close and unless the user has a good reason to choose58 differently they should just click on the buttons to accept both. {D_max} must59 still be set by the user. However, besides looking at the output, the user can60 click the explore button which will bring up a graph of chi^2 vs Dmax over a61 range around the current Dmax. The user can change the range and the number of62 points to explore in that range. They can also choose to plot several other63 parameters as a function of Dmax including: I0, Rg, Oscillation parameter,64 background, positive fraction, and 1-sigma positive fraction.65 66 47 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 67 48 … … 74 55 .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 75 56 76 .. note:: This help document was last modified by Paul Butler, 05 September, 201657 .. note:: This help document was last changed by Steve King, 01May2015 -
test/sasdataloader/test/utest_ascii.py
r7d94915 rb699768 94 94 f = self.loader.load("ascii_test_6.txt") 95 95 # The length of the data is 5 96 self.assertEqual(f, None) 96 self.assertEqual(len(f.x), 4) 97 self.assertEqual(f.x[0],0.013534) 98 self.assertEqual(f.x[3],0.022254) 97 99 98 100 if __name__ == '__main__':