The use of conformal mappings provided an alternative to k-means cluster analysis, which determines the bases from the data distribution. Without conformal mappings, tensor product networks using sine and Gaussian bases achieved comparable performance in approximating test functions, and the sine bases produced better-conditioned tensor product matrices than the Gaussian bases.

The algorithm for finding the optimal tensor product weights created a matrix formulation of the tensor products and solved the resulting system of linear equations. The first step in creating the matrix formulation was the evaluation of the selected local basis function at points along each dimension of the input data. Kronecker product operations were then applied to the row vectors containing the basis function evaluations along the corresponding dimensions; each row resulting from these Kronecker products corresponded to one input vector from the training data set. Singular value decomposition solved the weighted least squares formulation of this problem. The computational complexity was lower than that of backward error propagation in multilayer perceptrons and comparable to learning in radial basis function networks.
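The construction described above, per-dimension basis evaluation, Kronecker products to form one design-matrix row per training vector, and an SVD-based weighted least squares solve, can be sketched as follows. This is an illustrative sketch only: the `sine_basis` definition (a sinc bump), the shared centers and width, and all parameter values are assumptions, not the dissertation's exact choices.

```python
import numpy as np

def sine_basis(x, centers, width):
    # Hypothetical sine-type local basis: a sinc bump at each center.
    # (The dissertation's exact basis definition is assumed here.)
    return np.sinc((x - centers) / width)

def tensor_product_design_matrix(X, centers, width):
    # One row per training vector: the Kronecker product of the
    # per-dimension basis-evaluation row vectors.
    rows = []
    for x in X:
        row = np.ones(1)
        for d in range(X.shape[1]):           # evaluate basis along each dimension
            phi_d = sine_basis(x[d], centers, width)
            row = np.kron(row, phi_d)         # accumulate Kronecker product
        rows.append(row)
    return np.vstack(rows)

def solve_weighted_least_squares(A, y, w):
    # Solve min ||W^(1/2)(A c - y)||_2; np.linalg.lstsq uses SVD internally.
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    return coeffs

# Tiny usage example: 2-D inputs, 3 basis centers per dimension -> 3*3 = 9 weights.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(2 * np.pi * X[:, 1])
centers = np.array([0.0, 0.5, 1.0])
A = tensor_product_design_matrix(X, centers, width=0.5)
c = solve_weighted_least_squares(A, y, np.ones(len(y)))
print(A.shape, c.shape)  # (20, 9) (9,)
```

Note how the number of columns grows multiplicatively with the number of centers per dimension, which is why the full 25 laboratory values could not be used directly, as discussed below.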
The singular value decomposition component required the largest number of computational operations, so the greatest computational improvement will come from more efficient least squares algorithms. The sine tensor product (STP) network was applied to the problem of detecting delirium from the laboratory data factors. The 25 laboratory values were not used directly because the number of resulting tensor product bases would have been larger than the number of data observations. The factor analysis also imposed a form of regularization on the solution. Using a weighted least squares formulation, the sine tensor product network achieved greater classification accuracy on the full data set than other artificial neural network approaches, but poor cross-validation accuracy on the test data. To address this overfitting, this dissertation applied a second form of regularization by setting the lower-magnitude singular values to zero. This regularized STP network achieved greater cross-validation accuracy on the test data than the other artificial neural network solutions. Overall, this dissertation has contributed to the fields of computer science,
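The second regularization step, zeroing the lower-magnitude singular values before forming the solution, can be sketched as a truncated-SVD least squares solve. This is a minimal sketch under assumptions: the truncation rank `keep` is an illustrative parameter (in practice it would be chosen by cross-validation), and the matrices here are synthetic, not the dissertation's data.

```python
import numpy as np

def truncated_svd_solve(A, y, keep):
    # Regularized least squares: keep only the `keep` largest singular
    # values and zero the rest before forming the pseudoinverse solution.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:keep] = 1.0 / s[:keep]        # discard low-magnitude singular values
    return Vt.T @ (s_inv * (U.T @ y))

# Usage: an ill-conditioned system where truncation stabilizes the weights.
rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))
A[:, -1] = A[:, 0] + 1e-10 * rng.normal(size=30)   # near-dependent column
y = rng.normal(size=30)
c_full = truncated_svd_solve(A, y, keep=10)        # unregularized (full rank)
c_reg = truncated_svd_solve(A, y, keep=9)          # smallest singular value dropped
print(np.linalg.norm(c_reg) < np.linalg.norm(c_full))
```

Discarding the tiny singular values removes the directions in which the fit is dominated by noise, which is consistent with the improved cross-validation accuracy reported above at the cost of some accuracy on the full training set.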