Maestría en Ciencias de la Información y las Comunicaciones

Permanent URI for this collection: http://hdl.handle.net/11349/18

Browse

Recent submissions

Showing 1 - 20 of 298
  • Item
    Zonificación ambiental de distritos de manejo integrado usando el método AHP
    (Universidad Distrital Francisco José de Caldas) Ruiz Morales, Tatiana Lorena; Melo Martínez, Carlos Eduardo; Melo Martínez, Carlos Eduardo [0000-0002-5598-1913]
    This project seeks to build a Methodological Guide based on the Analytic Hierarchy Process (AHP) method for the generation of environmental zoning of protected areas, specifically for the category of Integrated Management Districts, as one of the components required in the formulation and updating of Management Plans, in accordance with the provisions of Decree 2372 of 2010 and Decree 1076 of 2015 issued by the Presidency of the Republic of Colombia, in relation to the National System of Protected Areas.
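As a rough illustration of how the AHP method referenced above derives priority weights, the following sketch computes weights from a pairwise comparison matrix using the normalized-column-sum approximation. The criteria and judgment values are hypothetical, not taken from the thesis:

```python
# Hypothetical AHP sketch: derive priority weights for zoning criteria
# from a pairwise comparison matrix (Saaty scale), using the
# normalized-column-sum approximation of the principal eigenvector.

def ahp_weights(matrix):
    """Return approximate priority weights for an n x n pairwise matrix."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # Normalize each column, then average across each row.
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Example judgments (illustrative only): slope vs. land cover vs. proximity to water
comparison = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(comparison)
print([round(w, 3) for w in weights])
```

A full AHP application would also check the consistency ratio of the judgments before accepting the weights.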
  • Item
    Modelo de asignación de recursos indivisibles con restricciones de exclusividad aplicando agentes autónomos mediante negociación distribuida
    (Universidad Distrital Francisco José de Caldas) Hernández Montealegre, Daniel Alexander; Barón Velandia, Julio; Barón Velandia, Julio [0000-0002-9491-5564]
    This research project proposes the design of a model for allocating indivisible resources with exclusivity constraints through the application of autonomous agents that negotiate in a distributed manner. The initiative arises as a response to the need to represent scenarios in which resources cannot be freely divided or shared among all entities and where fairness in resource distribution must be considered. The model is developed on a distributed architecture using autonomous agents, with negotiation protocols that allow agents to reach agreements while respecting their individual constraints and objectives. Through an iterative and incremental methodological approach, the allocation model is built, the most suitable negotiation protocols are established, and scenarios are simulated to evaluate the relevance of the proposed approach. The aim is for the results to help bridge the gap between fair allocation theory and distributed negotiation protocols, offering a flexible, fault-tolerant, and conceptually sound option applicable to dynamic environments.
  • Item
    Diseño de modelo arquitectónico asíncrono para la implementación de servicios de computación sin servidor en la infraestructura de nube privada de IFX Networks
    (Universidad Distrital Francisco José de Caldas) González Angarita, Breitner Enrique; Gaona García, Paulo Alonso; Gaona García, Paulo Alonso [0000-0002-8758-1412]
    Serverless computing is an architectural model where software developers do not have to worry about the infrastructure or the management of the environment where their applications run, focusing solely on developing the business logic of their solutions. IFX Networks is a company that provides its clients with telecommunications services, Data Centers (IaaS), and public and private cloud services, among others. However, serverless computing services have not yet been implemented due to the costs associated with using such services through public cloud providers. In this context, this project proposes the development of an asynchronous architectural model for the implementation of serverless computing services within IFX Networks’ private cloud infrastructure. This model enables the company's development and automation center to leverage its infrastructure by adopting this approach as a modern software development strategy for its software factory team, optimizing costs and improving software construction efficiency. The goal of this project is to design an asynchronous model for the implementation of serverless services in IFX Networks’ private cloud. To achieve this, a Micro Frontend architecture with Role-Based Access Control (RBAC) was used in the presentation layer, offering multiple benefits such as security, modularity, autonomy, and scalability (Kurapati, 2024). This provides the solution with a robust shared authentication and authorization model. Additionally, a Web Application Firewall (WAF) was implemented between the Frontend and Backend components to establish a reliable security barrier for serverless function code traffic within the internal network. This protects the solution from attacks such as SQL injection, cross-site scripting (XSS), among others (Anwar, Abdullah, & Pastore, 2021). 
The architecture follows an asynchronous approach using middleware based on AMQP (Advanced Message Queuing Protocol), improving the system's scalability and fault tolerance (Ćatović, Buzađija, & Lemeš, 2022).
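The queue-mediated decoupling that AMQP middleware provides can be illustrated with a minimal producer/consumer sketch. Python's standard-library queue stands in here for a real broker, and all names are illustrative, not part of the IFX Networks design:

```python
# Illustrative producer/consumer decoupling, standing in for an AMQP broker:
# the producer enqueues serverless-function invocations, a worker thread
# consumes them asynchronously, so neither side blocks on the other.
import queue
import threading

invocations = queue.Queue()          # stands in for an AMQP queue
results = []

def worker():
    while True:
        msg = invocations.get()
        if msg is None:              # sentinel: shut down the consumer
            break
        results.append(f"executed:{msg}")
        invocations.task_done()

t = threading.Thread(target=worker)
t.start()

for i in range(3):
    invocations.put(f"fn-{i}")       # producer publishes and moves on
invocations.put(None)
t.join()
print(results)
```

With a real broker, the producer and consumer would additionally survive each other's failures, which is the fault-tolerance property the abstract refers to.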
  • Item
    Metodología para la detección y clasificación en tiempo real de muescas de tensión
    (Universidad Distrital Francisco José de Caldas) Caicedo Navarro, Joaquín Eduardo; Rivas Trujillo, Edwin; Meyer, Jan; Rivas Trujillo, Edwin [0000-0003-2372-8056]
    Voltage notches, characterized by short-duration, periodic deviations in voltage waveforms often accompanied by high-frequency oscillations, represent a critical yet understudied type of Power Quality Disturbance (PQD) prevalent in industrial low-voltage networks. Caused by the normal operation of line-commutated power converters, these disturbances can trigger resonances, cause equipment malfunction, and accelerate aging, leading to significant economic losses. This thesis proposes and validates a novel methodology for real-time detection, classification, and severity assessment of voltage notches, addressing substantial gaps identified through a comprehensive literature review. A systematic review revealed that existing approaches offer limited focus on both non-oscillatory and oscillatory notches and lack robust, computationally efficient detection and classification methods. To overcome these limitations, a flexible MATLAB/Simulink-based simulation platform was developed to synthetically generate voltage and current waveforms associated with commutation notches. By systematically varying parameters such as short-circuit power, firing angle, snubber circuits, and background voltage unbalance and distortion, the platform generates realistic signals validated against field measurements, thereby providing a solid foundation for the development of the methodology. The novel methodology integrates the Space Phasor Model (SPM) and Fourier Descriptors (FDs) within a deep autoencoder framework, trained exclusively on notch-containing signals. FDs are computed from the SPM of voltage and current, and characteristic FDs, derived from pulse rectifier harmonic theory, are identified as reliable spectral signatures for notch detection. 
A novel anomaly detection strategy, based on autoencoder reconstruction errors (where notch signals exhibit low reconstruction error and non-notch signals yield higher error), yields over 99% accuracy on simulated data and 96% on field measurements. Subtype classification (non-oscillatory vs. oscillatory) is performed using a linear support vector machine trained in the autoencoder’s latent space, achieving over 97% accuracy. Severity assessment is performed using physically interpretable indices derived from the SPM radius reconstructed exclusively with characteristic FDs (“notch-only” SPM), allowing classification into mild, moderate, significant, or severe categories. These insights support targeted mitigation strategies. Finally, real-time performance evaluations confirm that the comprehensive detection, classification, and severity assessment pipeline consistently operates within a single power frequency cycle, even at high sampling rates, demonstrating the methodology’s suitability for embedded or edge-computing applications. Overall, this thesis significantly advances PQD analysis by delivering a robust, interpretable, and computationally efficient framework for voltage notch detection and classification.
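The detection rule described above (the autoencoder is trained only on notch-containing signals, so a low reconstruction error indicates a notch) reduces to a threshold test on the error. In this sketch a fixed "reconstruction" and an arbitrary threshold stand in for the trained model:

```python
# Sketch of the reconstruction-error detection rule: the autoencoder is
# trained exclusively on notch-containing signals, so a LOW reconstruction
# error indicates a notch and a HIGH error indicates a non-notch signal.
# The reconstructions and threshold below are illustrative stand-ins.

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def is_notch(signal, reconstruction, threshold=0.05):
    """Flag a notch when the autoencoder reproduces the signal closely."""
    return mse(signal, reconstruction) < threshold

notch_signal = [0.0, 0.8, -0.8, 0.1]
good_reco    = [0.0, 0.79, -0.81, 0.1]   # model has learned this shape
clean_signal = [0.0, 1.0, 0.0, -1.0]
poor_reco    = [0.0, 0.5, -0.5, 0.0]     # model cannot reproduce it

print(is_notch(notch_signal, good_reco))   # low error -> notch
print(is_notch(clean_signal, poor_reco))   # high error -> not a notch
```

In the thesis the threshold would be calibrated on validation data rather than fixed by hand.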
  • Item
    Metodología para la evaluación de los incendios de la vegetación en la Sierra de la Macarena utilizando técnicas de análisis de imágenes basadas en objetos geográficos y árboles de decisión
    (Universidad Distrital Francisco José de Caldas) Cabezas Martín, Gonzalo; Daza Medina, Rubén Javier; Daza Medina, Rubén Javier [0000-0002-9851-9761]
    This study evaluates the severity and impact of forest fires in the Sierra de la Macarena, Meta, during the period 2018–2024, using Sentinel-2 MSI imagery and advanced geospatial analysis techniques. Through an object-based image analysis (GEOBIA) approach combined with the implementation of decision trees and sequential pattern mining, affected areas were classified, allowing not only the quantification of fire severity but also the identification of vegetation regeneration patterns over time. The results provide crucial insights into how fires have transformed the ecosystem and how it has responded over the years, offering a solid basis for post-fire planning and management in this important natural park. This comprehensive approach enabled continuous and updated monitoring of fire impacts, supporting the conservation of one of Colombia’s most biodiverse ecosystems.
  • Item
    Metodología para la gestión de información agrícola implementando cubos de datos para el fortalecimiento de aplicaciones espaciales con machine learning
    (Universidad Distrital Francisco José de Caldas) Montaño Moreno, Andrés Orosman; Rocha Salamanca, Luz Ángela; Valbuena Gaona, Martha Patricia; Rocha Salamanca, Luz Ángela [0000-0001-5274-4819]
    Contemporary agriculture faces challenges arising from climate change, economic pressures, and population growth that require the adoption of techniques and systems capable of improving productivity without compromising environmental sustainability or food security. In this context, satellite observations together with machine learning methods can contribute to monitoring and decision-making; however, their practical application is limited by input heterogeneity, the lack of interoperability standards, and radiometric variability among products. This thesis addresses these limitations by proposing a methodology for the management of agricultural information through multitemporal raster data cubes. The proposed methodology defines a modular workflow: requirements definition and field data capture (FieldMaps); acquisition of multitemporal series (PlanetScope and Sentinel-2); preprocessing and normalization (TOA and surface reflectance, atmospheric corrections with ENVI); calculation of spectral indices (NDVI, GNDVI, CLGreen, TVI, among others); spectral segmentation (mean-shift); and assembly of the raster cube in ArcGIS Pro. The cube organizes information by pixel and date, enabling temporal queries and the systematic extraction of training vectors for regression and classification models. Reference meteorological variables (e.g., NASA-POWER) are also incorporated to complement the inputs, with emphasis on their use as auxiliary data. Validation was performed through two case studies. The first involved estimating the phenological stage of onion in Tota (Boyacá), using 17 PlanetScope scenes (Dec 2023–May 2024) and a field control point recorded on 19 May 2024; linear regression, a multilayer perceptron (MLP) neural network, and Random Forest were compared, with the MLP obtaining the best results (R² = 0.91, MSE = 4.07). 
The second study addressed detection and classification of agricultural cover in prioritized areas (Putumayo, Guaviare, and Antioquia), comparing Random Forest and the Spectral Angle Mapper (SAM); Random Forest showed higher overall accuracy (94.4%), Kappa = 0.84, and recall for the “Crop” class close to 96%. The analysis highlights that organizing inputs into raster cubes contributes to greater spatial and radiometric coherence among sources and facilitates experimental repeatability. Nonetheless, practical limitations were identified: direct inclusion of meteorological variables in the models produced signs of overfitting in some cases; the availability of Surface Reflectance (SR) or Analysis Ready Data (ARD) products improves spectral consistency; and discrimination of very similar species may require inputs with higher spectral resolution. Consequently, cautious use of auxiliary variables is advised, along with prioritization of SR/ARD products and evaluation of hyperspectral inputs when discrimination requirements justify them. Overall, the document presents a modest technical proposal applicable to operational contexts, accompanied by empirical evidence and practical recommendations for its implementation and scaling in settings with varying resources and capacities.
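The spectral indices listed in the workflow above are simple band ratios computed per pixel. As an illustration, NDVI and GNDVI follow their standard definitions; the reflectance values below are arbitrary examples, not data from the thesis:

```python
# Per-pixel spectral indices used in the workflow (standard definitions):
#   NDVI  = (NIR - Red)   / (NIR + Red)
#   GNDVI = (NIR - Green) / (NIR + Green)

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    return (nir - green) / (nir + green)

# Example surface-reflectance values for one pixel (illustrative only)
nir, red, green = 0.45, 0.09, 0.12
print(round(ndvi(nir, red), 3))    # dense vegetation gives values near 1
print(round(gndvi(nir, green), 3))
```

In the raster cube these computations run over whole multitemporal image stacks, which is what makes per-pixel, per-date temporal queries possible.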
  • Item
    Modelo de clasificación de imágenes basado en multiview learning
    (Universidad Distrital Francisco José de Caldas) Díaz Monje, Kevin Camilo; Gelvez García, Nancy Yaneth
    This work proposes and develops an image classification model based on Multiview Learning (MVL). Its scope focuses on model generation and validation to improve the accuracy and relevance of content visualization systems. The project integrates various image views—convolutional neural networks, metadata analysis, object detection, and user feedback—to provide more robust, contextualized, and meaningful classifications. The case study focuses on images related to the armed conflict in Colombia, given the cultural, semantic, and historical complexity that this type of material entails. In this way, the model not only addresses a technical challenge but also a social one, contributing to a more accurate interpretation of sensitive content in immersive environments. Regarding its importance, the project contributes to methodological innovation by overcoming the limitations of traditional, one-dimensional image classification through the integration of multiple analytical perspectives. On a practical level, it enhances the relevance and coherence of visual content in applications such as digital museums, augmented/virtual reality educational environments, and interactive cultural platforms. In terms of academic and social impact, it opens up the possibility of designing systems that are more sensitive to the cultural and historical context, strengthening memory building and the user experience. The results achieved, with accuracy metrics above 80% and a high level of user satisfaction in terms of adaptability and customization, demonstrate that the proposed model is a significant contribution both to research on classification issues and to practical implementation in culturally and socially relevant visualization systems.
  • Item
    Metodología machine learning para el tratamiento de imágenes computarizadas en pacientes con cáncer de pulmón
    (Universidad Distrital Francisco José de Caldas) Cely Granados, Oscar Leonardo; Salcedo Parra, Octavio José; Salcedo Parra, Octavio José [0000-0002-0767-8522]
    This research proposes a computational methodology aimed at identifying patterns associated with lung cancer, using exclusively machine learning and deep learning tools implemented in Python. The study is based on the analysis of the LIDC-IDRI dataset (The Lung Image Database Consortium and Image Database Resource Initiative), provided by the U.S. National Cancer Institute, which contains medical images in DICOM (Digital Imaging and Communications in Medicine) format. DICOM is the international standard for the transmission, storage, and processing of medical images, allowing the integration of patient information, acquisition characteristics, and the image itself into a single file. In addition to DICOM images, the dataset includes radiologist segmentations, nodule counts, and clinical diagnoses in structured files. This methodology focuses on the processing, integration, and analysis of large volumes of data, with the aim of exploring significant correlations and behaviors within the available variables. Although the purpose is not to provide direct clinical diagnosis, the patterns identified could serve as a basis for future research and support the development of diagnostic assistance systems. Each patient can generate between 10 and 15 GB of information, which poses relevant challenges regarding efficient processing, organization, and data interpretation. This work seeks to contribute to strengthening computational analysis applied to lung cancer, from an engineering, exploratory perspective, centered on leveraging complex medical data.
  • Item
    Metodología para el monitoreo de la subsidencia del suelo en la ciudad de Bogotá D.C. con técnicas de interferometría y Persistent Scatterers
    (Universidad Distrital Francisco José de Caldas) Suárez Jaimes, Paola Andrea; Suárez Torres, Edilberto; Suárez Torres, Edilberto [0000-0002-7582-1108]
    The subsidence phenomenon is a vertical movement of the Earth's surface, triggered by several factors, such as fluid extraction, ground compaction, mineral exploitation, reservoir exploitation, etc. This phenomenon has been occurring in countries such as Spain, the United States, Italy, Mexico, China, and Colombia, causing ground fractures and damage to urban infrastructure, as well as environmental, social, and economic repercussions. In the city of Bogotá, subsidence has been demonstrated for the period between 2006 and 2008, evidencing deformations and subsidence of up to 7.5 cm/year. The localities of Puente Aranda, Engativá, Teusaquillo, and Fontibón were the most affected (Blanco, Barreto, & Dulfay, 2009). These studies have been carried out using differential interferometry. This technique allows deformations to be detected between two image capture moments, but the capital district does not have tools or algorithms that capture time trends for continuous monitoring (Bogotá, 2016). Given that subsidence studies conducted in Bogotá in recent years have revealed a strong subsidence trend in some areas of the city, the need arose to implement one of the most recent methods worldwide, based on time series interferometry and called Persistent Scatterers, which allows continuous monitoring of surface deformation. This project seeks to conduct the first Persistent Scatterers tests in Bogotá to enable viable methodological alternatives for monitoring subsidence in the city using open-source data.
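In differential interferometry, including the Persistent Scatterers technique discussed above, the interferometric phase maps to line-of-sight displacement through the radar wavelength. A minimal sketch of that conversion follows; the C-band wavelength value is an assumption typical of open-source Sentinel-1 data, not a parameter stated in the thesis:

```python
# Interferometric phase -> line-of-sight (LOS) displacement.
# Standard relation: d_LOS = wavelength * delta_phase / (4 * pi),
# because the radar signal travels the displacement twice (two-way path).
import math

WAVELENGTH_CM = 5.55  # approx. Sentinel-1 C-band wavelength (assumed here)

def los_displacement_cm(delta_phase_rad, wavelength_cm=WAVELENGTH_CM):
    return wavelength_cm * delta_phase_rad / (4 * math.pi)

# One full phase cycle (2*pi) corresponds to half a wavelength of motion.
print(round(los_displacement_cm(2 * math.pi), 3))  # ~2.775 cm
```

Persistent Scatterers processing applies this relation over long time series at stable reflectors, which is what turns two-date deformation maps into continuous monitoring.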
  • Item
    Análisis de datos y desarrollo de un sistema de inteligencia artificial para la detección de la tasa de aprobación de visas a estados unidos
    (Universidad Distrital Francisco José de Caldas. Revista Prospectiva, Universidad Autónoma del Caribe) Novoa Peralta, Leonardo Andrés; Medina García, Víctor Hugo; Medina García, Víctor Hugo [0000-0002-0989-8676]
    This article presents an analysis of the profiles of individuals who are approved or denied tourist visas in Colombia, using data mining through the CRISP-DM methodology and computational intelligence, taking into account two crucial factors: (1) the denial rate is approximately 46%, which means that Colombians spend nearly two million dollars on rejected visas; and (2) visa interview appointments have been delayed by up to two years since the emergence of COVID-19. Supported also by other investigations, such as that of Prateek and Karun, who used classification algorithms to detect profiles and predict the outcomes of study visas, our analysis obtains information about the patterns and common characteristics among applicants who have obtained approval for their tourist visa, such as age, gender, nationality, marital status, and profession, among other aspects recorded in the application forms (DS-160). The analysis concludes with the development of an AI calculator capable of predicting the approval probability with an effectiveness of over 85%, ideal for applicants who want to see which points are relevant to improve, or simply to postpone their application and wait for the right moment.
  • Item
    Mejoramiento de la descripción de recursos educativos abiertos, a partir de técnicas basadas en inteligencia artificial, machine learning y minería de datos
    (Universidad Distrital Francisco José de Caldas) Cotta García, Juan Guillermo; Herrera Cubides, Jhon Francined; Gaona García, Paulo Alonso; Herrera Cubides, Jhon Francined [0000-0003-1615-4656]
    Although Open Educational Resources (OER) are fundamental for teaching, learning, and research thanks to open access policies and web tools, their potential is often limited. Despite their exponential growth, many OERs are published with low-quality or incomplete metadata descriptions, which hinders their discovery, retrieval, and effective reuse in open digital repositories, leading to issues of ambiguity and inconsistency. To address this problem, a strategy is proposed based on machine learning techniques (Large Language Models - LLMs) and embedding techniques (vector representations) for semantic capture, aimed at improving the metadata elements that describe OERs. This strategy seeks to uncover new details that provide a better description of a resource, thereby maximizing the potential of OERs across various open digital repositories.
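The embedding-based semantic matching referred to above typically rests on comparing vector representations with cosine similarity. A minimal sketch follows; the vectors are toy stand-ins, not real LLM embeddings:

```python
# Cosine similarity between embedding vectors, the usual measure of
# semantic closeness when matching an OER's metadata against candidate
# enriched descriptions. Vectors below are toy stand-ins for LLM output.
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

resource_vec  = [0.2, 0.7, 0.1]     # embedding of an OER's current metadata
candidate_vec = [0.25, 0.65, 0.05]  # embedding of a candidate description

print(round(cosine_similarity(resource_vec, candidate_vec), 3))
```

High-similarity candidates would then be proposed as improved metadata elements for the resource.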
  • Item
    Modelo de gestión de información espacial para autorregular el acceso a estaciones de Transmilenio utilizando herramientas colaborativas
    (Universidad Distrital Francisco José de Caldas) García Guzmán, Pedro Pablo; Rocha Salamanca, Luz Ángela; Rocha Salamanca, Luz Ángela [0000-0001-5274-4819]
    The use of technology is becoming more accessible to everyone every day. Its use, focused on improving the quality of life of public transportation users, has generated the need to propose a solution that allows users to determine the congestion status of service stations when they need to use the service. This solution offers an alternative that allows them to decide the appropriate time to use the service, either by moving to another station or, knowing the congestion status of a station, by deciding whether to delay or advance their entry. The initial focus is on the TransMilenio system, but the solution is not limited to it and can be extended to other types of existing services or to those that may be implemented in the future. Likewise, the environment chosen for studying the problem does not restrict the use of the proposed solution elsewhere.
  • Item
    Metodología para implementar un modelo HiLeS de una WSN
    (Universidad Distrital Francisco José de Caldas) Idrobo López, Oscar Javier; Muñoz, Gerardo; Idrobo López, Oscar Javier [0009-0001-3236-3223]
    This thesis presents a methodology for implementing a HiLeS model within the context of Wireless Sensor Networks (WSN), aiming to establish a bridge between the conceptual representation of HiLeS elements and their practical implementation. First, a textual representation of HiLeS elements is developed, providing a formal structure to capture their characteristics and relationships. Subsequently, a visualization of this textual representation is proposed to facilitate comprehension and analysis of the model. The methodology further includes the translation of the textual representation into MicroPython, enabling execution in embedded hardware environments. To validate the approach, a problem within the context of a WSN is defined and logically simulated. Based on this simulation, a set of validation tests is conducted to verify the expected network behavior and its consistency with the HiLeS model. Finally, the results are analyzed, demonstrating the feasibility of the methodology and its potential application in the design and validation of WSN-based solutions.
  • Item
    Sistema IoT para la gestión de variables de proceso en ambientes de refrigeración de la industria cárnica
    (Universidad Distrital Francisco José de Caldas) Triana Useche, Jordan Camilo; Rodríguez Rojas, Luz Andrea; Rodríguez Rojas, Luz Andrea [0000-0003-0312-1177]
    The food industry faces the challenge of ensuring product quality and safety throughout the supply chain, particularly in cold storage processes. Within this context, the meat industry requires highly efficient refrigeration systems, as parameters like temperature and relative humidity directly impact product safety and shelf life (Pardo Martínez & Cotte Poveda, 2022). However, the lack of real-time monitoring mechanisms and energy optimization strategies has led to operational inefficiencies and risks of product spoilage (Ramírez-Faz et al., 2020). To address this issue, an IoT system was developed and implemented to manage and control critical variables in meat industry refrigeration systems. The primary goal is to optimize product preservation through continuous monitoring of temperature, relative humidity, and energy consumption, enabling data-driven decisions to ensure food safety (Díaz-Ruiz et al., 2019). The methodology involved integrating low-cost sensors connected to IoT devices, which transmit real-time data to a VPS server using the MQTT (Message Queuing Telemetry Transport) protocol. This infrastructure enables data visualization and analysis via the gAiA web platform, along with automated alerts for storage condition deviations (Wang et al., 2021). Predictive algorithms were also incorporated to identify energy consumption patterns and their impact on refrigeration efficiency (Han et al., 2021). Results demonstrate that this IoT system reduces energy consumption by 5%-10% without compromising storage quality. Additionally, data traceability supports preventive maintenance strategies and resource optimization, enhancing sector sustainability (Onoufriou et al., 2019). Due to its low cost and ease of implementation, this solution is a viable alternative for large-scale adoption in the meat industry, strengthening food security and reducing the environmental impact of refrigeration (Moran, 2024).
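The automated alerting on storage-condition deviations described above amounts to a range check over incoming sensor readings. The thresholds and field names in this sketch are illustrative, not the gAiA platform's actual configuration or real meat-industry setpoints:

```python
# Sketch of deviation alerts for cold-storage telemetry: a reading outside
# its allowed band produces an alert, as the MQTT-fed platform would.
# Limits below are illustrative assumptions, not industry setpoints.

LIMITS = {
    "temperature_c": (-1.5, 4.0),          # hypothetical storage band
    "relative_humidity_pct": (85.0, 95.0),
}

def check_reading(reading):
    """Return a list of alert strings for out-of-band variables."""
    alerts = []
    for var, (lo, hi) in LIMITS.items():
        value = reading.get(var)
        if value is not None and not (lo <= value <= hi):
            alerts.append(f"{var}={value} outside [{lo}, {hi}]")
    return alerts

print(check_reading({"temperature_c": 2.0, "relative_humidity_pct": 90.0}))  # []
print(check_reading({"temperature_c": 6.3, "relative_humidity_pct": 90.0}))
```

In the deployed system, such checks would run server-side on messages arriving over MQTT, with alerts pushed to operators and logged for traceability.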
  • Item
    Desarrollo de un prototipo de modelo catastral 3D a partir de herramientas de geoprocesamiento para la visualización de los aspectos legales y físicos en una propiedad horizontal en el municipio de Soacha, Cundinamarca
    (Universidad Distrital Francisco José de Caldas) Cumbe Loaiza, Laura Camila; Castillo Méndez, Luis Eduardo
    This work aims to record the development of a 3D cadastre model prototype supported by geoprocessing tools that complement geographic information systems (GIS), focused on horizontal property in the municipality of Soacha. The objective is to create different scenes or views of a co-property and thus contribute to the development and various applications of cadastral information, representing the real estate right and demarcating the legal and physical aspects of the property.
  • Item
    Diseño de un prototipo en el motor de base de datos para la validación de reglas topológicas y cumplimiento del modelo de aplicación de levantamiento catastral LADM-COL v. 2.0.
    (Universidad Distrital Francisco José de Caldas) López Calle, María Isabel; Ortíz Dávila, Álvaro Enrique; Ortíz Dávila, Álvaro Enrique [0000-0001-8830-1657]
    This research arises with the objective of designing a prototype that validates topological rules and ensures compliance with the cadastral survey application model LADM-COL V2.0 in the context of multipurpose cadastre and current regulations. Validation functions are integrated into the database engine, allowing the reporting of the state of cadastral information, generating diagnostics of the data, and facilitating controlled data management. The project considers the perspective of cadastral managers, who receive information from IGAC in R1 and R2 formats, as well as the cadastral geographic database. It is developed using PostgreSQL, an open-source tool, and as a result, topological validation functionality for cadastral databases will be obtained. Additionally, it will include a module for migrating IGAC cadastral information to the survey application model, with results illustrated through a practical case study.
  • Item
    Modelo para la evaluación de la calidad intrínseca y dinámica de los datos abiertos sector salud en Bogotá D.C. basado en machine learning
    (Universidad Distrital Francisco José de Caldas) Varón Capera, Álvaro; Gaona García, Paulo Alonso; Gaona García, Paulo Alonso [0000-0002-8758-1412]; Varón Capera, Álvaro [0009-0005-4134-7304]
    The purpose of this master's thesis, developed under the in-depth modality, is to review the quality status of historical open data, specifically in health-sector open data repositories within the context of the city of Bogotá, in order to assess their quality. To this end, the quality of open data is evaluated based on criteria related to consistency, accuracy, redundancy, and update frequency, among others. Evaluating the quality of the open data displayed in repositories would facilitate the reuse of the datasets for studies focused on the prevention of epidemiological events, impacts on the provision of health services, and declines in overall health for a population.
  • Item
    Metodología para la elección de modelos de LLMs en aplicaciones de predicción meteorológica a través de algoritmos de ML sobre entornos de computación en la nube y capturas de datos a través de IoT
    (Universidad Distrital Francisco José de Caldas) Bello González, Iván Darío; Gaona García, Elvis Eduardo; Gaona García, Elvis Eduardo [0000-0001-5431-8776]
    Meteorological prediction is one of the critical factors addressed from various approaches and is fundamental for a wide range of sectors, such as agriculture, renewable energy, disaster management, and urban planning. Recent advances in Large Language Models (LLMs), Internet of Things (IoT), and cloud computing have opened new opportunities to improve the accuracy and efficiency of predictions in these sectors. However, there are several challenges related to the constant variability of environmental conditions and the reliability of data obtained from sensors. This research proposes the development of a comprehensive methodology to evaluate the impact of integrating LLMs with IoT infrastructures and cloud computing, with the aim of determining precision and improving the accuracy of meteorological predictions. The methodology comprises five iterative phases: Identification, Development, Testing and Monitoring, Evaluation, and Analysis. This approach allows for the continuous evaluation of LLMs and the adaptation of the system based on the obtained results, addressing the changing needs of the IoT environment. The study focuses on designing specific metrics to evaluate the performance of LLMs compared to traditional models, deployed within a scalable cloud platform that facilitates the integration of data generated by IoT devices. The methodology incorporates the use of a ReAct (Reasoning and Acting) agent, which improves the system's precision and accuracy by detecting anomalies in the data and adjusting responses accordingly. This agent also demonstrated the ability to identify when the model's performance was insufficient, recommending the use of more reliable data sources as an alternative to ensure the quality of predictions. In the case study, it was evident that some models exhibited low performance, with metrics such as R² close to zero, indicating an inability to capture underlying patterns in the data. 
However, the inclusion of the ReAct agent mitigated these problems by making critical decisions to maintain the quality of predictions. The results demonstrated the system's ability to adjust and improve as new data is collected, making the process adaptive and more robust. It is expected that the results of this research will significantly contribute to the advancement of meteorological prediction, with direct benefits for critical sectors and various stakeholders. The developed methodology lays the foundation for future research and applications in this field, facilitating more accurate and reliable meteorological predictions. The combination of LLMs with IoT and reactive agents not only enhances predictive capability but also the system's adaptability in changing environments, which is essential for modern meteorological applications.
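The R² metric cited in the case study above (values near zero indicating a model that captures no underlying pattern) is computed as one minus the ratio of residual to total variance. A minimal sketch, with illustrative data rather than the study's measurements:

```python
# Coefficient of determination: R^2 = 1 - SS_res / SS_tot.
# Values near 1 mean the model explains most of the variance; values
# near 0 (as seen for the weaker models above) mean it captures little.

def r_squared(y_true, y_pred):
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

observed  = [20.1, 21.4, 19.8, 22.0, 20.7]  # e.g. measured temperatures
predicted = [20.0, 21.0, 20.0, 21.8, 20.9]  # model output (illustrative)

print(round(r_squared(observed, predicted), 3))
```

Monitoring this metric over incoming data is what would let a supervising agent, such as the ReAct agent described above, decide that a model's performance is insufficient and switch data sources or models.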
  • Item
    Diseño de un metamodelo de software para el desarrollo de aplicaciones inmersivas
    (Universidad Distrital Francisco José de Caldas) Sánchez Cruz, Andrés Felipe; Gelvez García, Nancy Yaneth; Gelvez García, Nancy Yaneth [0000-0003-3334-6959]
    Users are the main purpose of software development, as the process revolves around their needs. The goal is always to satisfy them in the best possible way, but aspects that add value to the software and improve the user experience are often overlooked, commonly in immersive scenarios. This is evidenced by the fact that in many cases virtual environments are not suitable, owing to underlying issues of usability and interactivity. This document presents a study focused on the design of a metamodel based on usability and interactivity features in immersive applications such as virtual, augmented, or mixed reality, as this type of visualization is particularly sensitive to factors that can harm the user experience due to its nature. The sections of the document include the methodology proposal, research on existing projects that use the mentioned features, the design of a metamodel, and the development of a prototype based on it. Finally, the prototype is evaluated through a specifically designed test to validate the work done.
  • Item
    Metamodelo software para simulación remota interactiva en ambiente Web
    (Universidad Distrital Francisco José de Caldas) Piñeros Ramírez, Jeisson Rodrigo; Barón Velandia, Julio; Vanegas Ayala, Sebastián Camilo; Piñeros Ramírez, Jeisson Rodrigo [0009-0006-6809-3292]; Barón Velandia, Julio [0000-0002-9491-5564]; Vanegas Ayala, Sebastián Camilo [0000-0002-8610-9765]; García Barreto, Germán Alberto (Catalogador)
    Simulation plays a crucial role in supporting the understanding of real-world phenomena; interactive simulation with progressive event visualization is even more valuable for this purpose. However, certain types of systems require significant computational resources, in terms of both memory and processing. A possible solution to this problem is to conduct simulations in a Web environment, allowing the client to perform progressive visualization of the results obtained from the server-side logic processing. Current simulation solutions present challenges for client devices, including low user interactivity on the client device and high consumption of both bandwidth and computational resources for processing simulation logic. This often makes such solutions limit interactivity or become costly in terms of processing, memory, and network usage. How can computational and network requirements on the client be reduced while maintaining high levels of interactivity and progressive visualization in simulation software? The objective of this proposal is to design a software metamodel for interactive remote simulation with progressive visualization in a Web environment by applying a methodological technique based on iterative and incremental models, which allows for obtaining results in both conceptual and development terms.