TecnoLógicas

Journal Information
ISSN / EISSN: 0123-7799 / 2256-5337
Current Publisher: Instituto Tecnológico Metropolitano (10.22430)
Total articles ≅ 408
Current Coverage
INSPEC
DOAJ
Archived in
SHERPA/ROMEO

Latest articles in this journal

Andrés F. Bravo-Montoya, Jefersson S. Rondón-Sanabria, Elvis E. Gaona-García
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 185-194; doi:10.22430/22565337.1491

Abstract:This paper shows the vulnerabilities present in a wireless sensor network implemented over a long-range wide area network (LoRaWAN) and identifies possible attacks that could be carried out against the network using sniffing and/or replay. The attacks were performed by implementing a protocol analyzer (sniffer) to capture packets. The sniffer was implemented on RTL2832U hardware and visualized in Wireshark through GNU Radio. Tests showed that data availability and confidentiality could be threatened through replay attacks, verified against the LoRa server, using HackRF One hardware and GNU Radio. Although the LoRaWAN specification includes frame counters to prevent replay attacks, under the right conditions this measure can be violated and can even deny service to the node on the server.
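The paper itself does not include code; as a minimal, hedged sketch (not the authors' implementation), the lines below show the server-side frame-counter check that the LoRaWAN specification relies on and that the abstract says can be bypassed. The `uplink` dictionary and its `dev_addr`/`fcnt` fields are hypothetical names for illustration only.

```python
# Minimal sketch (not from the paper): server-side frame-counter check that
# LoRaWAN uses to reject replayed uplinks. Field names are hypothetical.

last_fcnt = {}  # device address -> highest frame counter seen so far

def accept_uplink(uplink):
    """Accept an uplink only if its frame counter is strictly increasing.

    A replayed packet carries a stale (lower or equal) counter and is dropped;
    the paper shows that resetting or overflowing the counter can defeat this.
    """
    dev, fcnt = uplink["dev_addr"], uplink["fcnt"]
    if fcnt <= last_fcnt.get(dev, -1):
        return False          # possible replay: stale counter
    last_fcnt[dev] = fcnt
    return True

# Example: a captured-and-replayed packet is rejected the second time.
pkt = {"dev_addr": "26011F22", "fcnt": 42}
print(accept_uplink(pkt))  # True  (first reception)
print(accept_uplink(pkt))  # False (replay of the same frame)
```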
Álvaro Espinel-Ortega, Adriana Vega-E
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 171-183; doi:10.22430/22565337.1484

Abstract:When electrical engineering students start their instrumentation and measurement course, they have already taken calculus, physics, probability, and statistics. However, they have trouble applying that knowledge to problems involving electrical measurements and the variables found in professional practice, such as water flows, solar radiation, wind speed, and water levels. This paper shows how to integrate all the concepts involved in calculating measurement uncertainty in order to improve the way the results of measurement and/or error-determination processes are reported. For that purpose, the study presents an applied exercise and a methodological process by means of an example in which the value of a resistance is determined from voltage and current measurements using a small data set. The objective is to focus the process on estimating Type A and Type B uncertainty and the factors that affect the measurement process, such as uncertainty due to random variations of the measured signals, instrument defects, instrument imprecision, or instrument resolution. During the uncertainty calculation proposed here, students apply the probabilistic knowledge they have acquired to determine the expanded uncertainty U from the combined uncertainty uc.
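As a hedged illustration of the kind of calculation the abstract describes (not taken from the paper), the sketch below estimates a resistance R = V/I from a handful of voltage and current readings, combines a Type A component (standard deviation of the mean) with an assumed Type B component from instrument resolution, and expands the result with a coverage factor k = 2. The sample values and resolutions are invented for the example.

```python
# Hedged example (not from the paper): Type A + Type B uncertainty for R = V / I.
import math
import statistics as st

volts = [10.02, 10.05, 9.98, 10.01, 10.03]   # invented readings, V
amps  = [0.501, 0.503, 0.499, 0.500, 0.502]  # invented readings, A

def type_a(readings):
    """Type A uncertainty: standard deviation of the mean."""
    return st.stdev(readings) / math.sqrt(len(readings))

def type_b(resolution):
    """Type B uncertainty of a digital readout: resolution / (2*sqrt(3))."""
    return resolution / (2 * math.sqrt(3))

V, I = st.mean(volts), st.mean(amps)
u_V = math.hypot(type_a(volts), type_b(0.01))   # 0.01 V resolution (assumed)
u_I = math.hypot(type_a(amps),  type_b(0.001))  # 1 mA resolution (assumed)

R = V / I
# Combined uncertainty of R = V/I by propagating relative uncertainties.
u_R = R * math.hypot(u_V / V, u_I / I)
U = 2 * u_R  # expanded uncertainty, coverage factor k = 2 (~95 %)

print(f"R = {R:.3f} ohm, u_c = {u_R:.3f} ohm, U = {U:.3f} ohm")
```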
Luis Felipe Gaitán, Juan David Gómez, Edwin Rivas-Trujillo
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 195-212; doi:10.22430/22565337.1489

Abstract:Distributed generation is one of the most widely accepted strategies for meeting the growth in electrical demand around the world. Since 2014, Colombian government agencies have enacted laws and resolutions to promote and regulate the introduction of different generation technologies into the country’s electrical system. Incorporating distributed generation systems into conventional distribution networks can cause problems if technical studies are not carried out beforehand to determine the consequences of bringing these new generation technologies into operation. This scenario represents a new challenge for distribution network operators, because they must ensure that their systems can integrate these new generation sources without affecting the correct operation of the grid. In this article, the IEEE 13-node test system is modified by incorporating the load curves of the three types of consumers in the Colombian electricity market into the model. Additionally, distributed generation systems based on non-conventional energy sources are integrated into two system nodes in order to perform a quasi-dynamic analysis of the different electrical variables, which can be used to determine the impact of these new technologies on a local distribution system. The voltage profiles and the active and reactive power do not show considerable changes in the behavior of the electrical network; however, in the simulation scenarios where distributed generators are operating, the system exhibits a considerable increase in line losses. There are two alternatives to manage these unusual levels in the operation of the nodes with distributed generation: (1) operating the new DG nodes in islanded mode or (2) strengthening the local distribution system by building new distribution lines in the network.
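The paper runs quasi-dynamic simulations on the modified IEEE 13-node feeder; as a much simpler, self-contained illustration of why local generation changes line losses (not the paper's model), the sketch below computes three-phase I²R losses on a single radial line feeding a load bus, with and without a DG injection at that bus. The voltage, resistance, load, and DG sizes are invented.

```python
# Hedged toy example (not the paper's model): I^2*R losses on one radial line
# feeding a load bus, with and without local distributed generation (DG).
V_LL = 13.2e3      # line-to-line voltage, V (assumed)
R_LINE = 0.8       # line resistance per phase, ohm (assumed)
P_LOAD = 2.0e6     # active power demand at the bus, W (assumed)

def line_losses(p_net_w):
    """Three-phase I^2*R losses for the net power carried by the line."""
    i = abs(p_net_w) / (3**0.5 * V_LL)   # line current, unity power factor assumed
    return 3 * i**2 * R_LINE

for p_dg in (0.0, 1.0e6, 3.0e6):         # no DG, partial DG, DG exceeding demand
    loss = line_losses(P_LOAD - p_dg)
    print(f"DG = {p_dg/1e6:.1f} MW -> line losses = {loss/1e3:.1f} kW")
```

The toy case shows the same qualitative behavior discussed in the abstract: moderate local generation reduces the power drawn through the line, while generation that exceeds (or is poorly matched to) the local demand drives reverse flows and higher losses.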
Cristhian D. Molina-Machado, Ernesto Cuartas, Juan D. Martínez-Vargas, Eduardo Giraldo
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 233-243; doi:10.22430/22565337.1344

Abstract:This paper proposes a comparative analysis between regular and parallel versions of FISTA and Tikhonov-like optimizations for solving the EEG brain mapping problem. Such comparison is performed in terms of computational time reduction and estimation error achieved by the parallelized methods. Two brain models (high- and low-resolution) are used to compare the algorithms. As a result, it can be seen that, if the number of parallel processes increases, computational time decreases significantly for all the head models used in this work, without compromising the reconstruction quality. In addition, it can be concluded that the use of a high-resolution head model produces an improvement in any source reconstruction method in terms of spatial resolution.
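Not from the paper, but a compact reminder of the two estimators being compared: the Tikhonov solution has the closed form x = (L'L + lambda*I)^(-1) L'y, while FISTA iterates a gradient step plus soft-thresholding with Nesterov acceleration. The sketch below uses a random lead-field matrix and toy dimensions as stand-ins for the real head models.

```python
# Hedged sketch (not the paper's implementation): Tikhonov vs. FISTA (L1)
# for a linear inverse problem y = L @ x, with a random stand-in lead field.
import numpy as np

rng = np.random.default_rng(0)
L = rng.standard_normal((64, 500))        # 64 electrodes, 500 sources (toy sizes)
x_true = np.zeros(500); x_true[[10, 200, 433]] = 1.0
y = L @ x_true + 0.01 * rng.standard_normal(64)

lam = 0.1

# Tikhonov (L2) solution: closed form.
x_tik = np.linalg.solve(L.T @ L + lam * np.eye(500), L.T @ y)

# FISTA (L1): gradient step + soft-thresholding + Nesterov momentum.
step = 1.0 / np.linalg.norm(L, 2) ** 2    # 1 / Lipschitz constant of the gradient
x = z = np.zeros(500); t = 1.0
for _ in range(200):
    g = L.T @ (L @ z - y)                 # gradient of the data-fit term at z
    w = z - step * g
    x_new = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # prox of lam*||.||_1
    t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
    z = x_new + ((t - 1) / t_new) * (x_new - x)                   # momentum step
    x, t = x_new, t_new

print("Tikhonov error:", np.linalg.norm(x_tik - x_true))
print("FISTA    error:", np.linalg.norm(x - x_true))
```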
Hernando A. Yepes, Carlos E. Arrieta, Andres A. Amell
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 115-154; doi:10.22430/22565337.1105

Abstract:The growth in energy demand, as well as in pollutant emissions, has spurred research into technologies that can mitigate both problems worldwide. Among the alternatives for improving the efficiency of thermal processes, the flameless combustion regime stands out as one of the most promising, since it achieves high thermal efficiency by enhancing heat transfer and the combustion process itself, with the consequent reduction in pollutant emissions. For this reason, the present study reviews the state of the art of this technology, emphasizing the associated phenomenological aspects, the main characteristics of the regime and its stability, the mechanisms for achieving it, and a series of studies, both national and international, in which fossil and alternative fuels were used. The review closes with a discussion of some cases in which the regime has been implemented at industrial scale.
María C. Quintero, Miryam Rincón, Jorge M. Osorio-Guillén, Diana López, Fernando Andrés Londoño-Badillo
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 15-23; doi:10.22430/22565337.1269

Abstract:Piezoelectric materials are widely used in electronic devices, and, traditionally, various lead-based materials have been employed in such applications. However, because of the harm caused by lead, other materials with similar characteristics that do not negatively affect human health or the environment have been developed. One material with those characteristics is potassium-sodium niobate, K0.5Na0.5NbO3. In this study, we investigate the thermogravimetric, structural, and microstructural properties of powders of this system obtained through oxide mixing, with the aim of establishing the effect and efficiency of grinding (using a horizontal and a planetary ball mill) on the production of the final material. Horizontal grinding followed by calcination at 900 °C was found to provide the optimal conditions for obtaining K0.5Na0.5NbO3 powders by oxide mixing with a structure and microstructure adequate for subsequent densification and/or doping processes.
Edier Aristizábal-Giraldo, Mariana Vasquez Guarin, Diana Ruíz
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 39-60; doi:10.22430/22565337.1247

Abstract:Several methods are available to assess landslide susceptibility at regional scales. Among the most widely used are bivariate and multivariate statistical methods, which require a landslide inventory. This study evaluates and zones landslide susceptibility in the northern Colombian Andes, in the region known as the Aburrá Valley, using two statistical methods: a bivariate one, Weights of Evidence, recommended by the Servicio Geológico Colombiano for hazard studies in rural areas, and a multivariate one, Logistic Regression, widely used worldwide. In both cases, the susceptibility model was built with the support of frequency histograms, Pearson correlation, discriminant analysis, and principal component analysis. ROC analysis was used to evaluate the performance, predictive capacity, and high/medium/low zoning criteria of each method. Logistic regression achieved an area under the curve of 76.8 % for performance and 77.5 % for predictive capacity, while Weights of Evidence achieved 77.8 % for performance and 77.5 % for prediction. These satisfactory results allow the outputs to be incorporated into the basic studies required for land-use planning.
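As a hedged sketch of the multivariate workflow mentioned in the abstract (not the authors' code, and with entirely synthetic conditioning factors and inventory), the lines below fit a logistic-regression susceptibility model and score it with the area under the ROC curve, the same metric the abstract reports.

```python
# Hedged sketch (not the authors' code): logistic-regression susceptibility
# model on synthetic data, evaluated with the area under the ROC curve.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 5000
# Invented conditioning factors: slope (deg), relative relief (m), distance to drainage (m).
X = np.column_stack([rng.uniform(0, 45, n), rng.uniform(0, 300, n), rng.uniform(0, 500, n)])
# Synthetic landslide inventory: steeper, higher-relief cells are more likely to fail.
p = 1 / (1 + np.exp(-(0.08 * X[:, 0] + 0.01 * X[:, 1] - 0.005 * X[:, 2] - 4)))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# AUC on the training split ("performance") and on the held-out split ("prediction").
print("AUC, success rate :", roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]))
print("AUC, prediction   :", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```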
Laura Stella Vega-Escobar, Gloria M. Díaz-Cabrera
Published: 20 September 2019
TecnoLógicas, Volume 22; doi:10.22430/22565337.1516

Abstract:In 2016, the Colombian Administrative Department of Science, Technology and Innovation (Colciencias) drafted and published its policy to improve the quality of scientific publications, one of whose main objectives is "to increase the presence of national scientific journals in the citation indexes and databases that the scientific communities of the different disciplines recognize as venues for disseminating research results with high scientific impact." In pursuit of this objective, Colciencias redesigned the classification model for national scientific journals, which determines indexing in the National Bibliographic Index IBN-Publindex [1].
Sebastián Sebastián, Adela M. Ceballos-Peñaloza, Luis F. Gutiérrez-Mosquera
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 25-38; doi:10.22430/22565337.1117

Abstract:Refrigeration and freezing are processes traditionally used in food preservation. In the case of açaí, a Brazilian fruit, these technologies are the methods most widely employed by industry. Applying theoretical models to predict freezing times and temperatures makes it possible to keep this operation under effective control. The objective of this work was to estimate and evaluate the time, temperature, and rate of two freezing processes for açaí pulp: conventional and cryogenic. The conventional freezing time for the product was 153.68 ± 6.42 min, with minimal error relative to the Pham (0.54 %) and Nagaoka (1.71 %) equations. For fast freezing, the time was 100.56 ± 17.90 s (1.68 ± 0.29 min), effectively represented by the Nagaoka model (6.81 %). The initial freezing temperatures were -0.64 ± 0.02 °C for the slow operation and -2.91 ± 0.86 °C for the fast one. With these results, it is possible to obtain and validate correlations that effectively predict the freezing times of fresh açaí pulp by considering the main process parameters: geometry, temperature of the cooling medium, and food composition.
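The Pham and Nagaoka correlations cited in the abstract are refinements of Plank's classical freezing-time equation; as a hedged illustration (not the paper's calculation), the sketch below evaluates Plank's equation for an infinite slab of pulp. The initial freezing temperature is taken from the abstract; every other property and condition is an invented placeholder.

```python
# Hedged illustration (not the paper's model): Plank's freezing-time equation
# for an infinite slab, the classical basis of the Pham and Nagaoka correlations.
RHO   = 1040.0    # pulp density, kg/m^3 (assumed)
L_F   = 250e3     # latent heat of freezing, J/kg (assumed)
K_F   = 1.6       # thermal conductivity of frozen pulp, W/(m*K) (assumed)
H     = 25.0      # surface heat-transfer coefficient, W/(m^2*K) (assumed)
T_F   = -0.64     # initial freezing temperature, deg C (from the abstract)
T_A   = -25.0     # freezing-medium temperature, deg C (assumed)
THICK = 0.02      # slab thickness, m (assumed)

def plank_time(thickness):
    """Plank freezing time (s) for an infinite slab: shape factors P = 1/2, R = 1/8."""
    P, R = 0.5, 0.125
    return RHO * L_F / (T_F - T_A) * (P * thickness / H + R * thickness**2 / K_F)

t = plank_time(THICK)
print(f"Estimated freezing time: {t:.0f} s ({t/60:.1f} min)")
```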
Jorge Luis Bacca, Henry Arguello
Published: 20 September 2019
TecnoLógicas, Volume 22, pp 1-14; doi:10.22430/22565337.1205

Abstract:Spectral image clustering is an unsupervised classification method that identifies pixel distributions using spectral information without requiring a prior training stage. Sparse subspace clustering (SSC) methods assume that hyperspectral images lie in the union of multiple low-dimensional subspaces. Based on this, SSC groups spectral signatures into different subspaces by expressing each spectral signature as a sparse linear combination of all the pixels, ensuring that the non-zero elements belong to the same class. Although these methods have shown good accuracy for unsupervised classification of hyperspectral images, their computational complexity becomes intractable as the number of pixels increases, i.e., when the spatial dimension of the image is large. For this reason, this paper proposes reducing the number of pixels to be classified in the hyperspectral image and then recovering the clustering results for the removed pixels by exploiting spatial information. Specifically, this work proposes two methodologies to remove pixels: the first is based on a spatial blue-noise distribution, which reduces the probability of removing clusters of neighboring pixels, and the second is a sub-sampling procedure that removes one of every two contiguous pixels, preserving the spatial structure of the scene. The performance of the proposed spectral image clustering framework is evaluated on three datasets, showing that similar accuracy is obtained when up to 50 % of the pixels are removed and that the method is up to 7.9 times faster than classifying the complete data sets.
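As a hedged sketch of the second (sub-sampling) strategy only, not the authors' code or their blue-noise variant, the lines below drop every other pixel of a toy hyperspectral cube in a checkerboard pattern, cluster the remaining spectra with k-means as a stand-in for SSC, and assign each removed pixel the label of a kept spatial neighbor. The cube sizes and number of clusters are invented.

```python
# Hedged sketch (not the authors' code): checkerboard sub-sampling of a
# hyperspectral cube, k-means as a stand-in for SSC, and spatial label fill-in.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
H, W, B, K = 40, 40, 20, 3                  # toy cube: 40x40 pixels, 20 bands, 3 clusters
cube = rng.standard_normal((H, W, B))

# Keep pixels where (row + col) is even: one of every two contiguous pixels.
rows, cols = np.indices((H, W))
keep = (rows + cols) % 2 == 0

labels = np.full((H, W), -1)
labels[keep] = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(cube[keep])

# Removed pixels inherit the label of a kept 4-neighbor (left neighbor here,
# falling back to the neighbor above on the first column).
miss_r, miss_c = np.where(~keep)
src_c = np.where(miss_c > 0, miss_c - 1, miss_c)
src_r = np.where(miss_c > 0, miss_r, miss_r - 1)
labels[miss_r, miss_c] = labels[src_r, src_c]

print("all pixels labelled:", bool(np.all(labels >= 0)))
```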