Global Journal of Computer Science and Technology, C: Software & Data Engineering, Volume 22 Issue 2

Figure 12: Scrum from the pilot in the integration of Big Data

IV. Discussion

As a result, the business areas are provided with data that offers wide possibilities of exploitation, complying with the seven Vs of Big Data. The BBVA API Market portal (BBVA et al., 2020) also considers these characteristics: Volume, Velocity, Variety, Value, Veracity, Variability, and Visualization. It is important to shut down isolated systems and databases, because the overhead they impose on infrastructure and maintainability, together with the mismanagement of the data, causes the information to lose veracity and, little by little, value. With the campaign process optimized by deploying it on Big Data, the financial entity becomes competitive with other entities in the market for customers.

The Data Scientists were not 100% dedicated to the development of the pilot; this complicated many of the rule-definition meetings and, in the long term, caused rework. This occurred because the respective business management areas did not want to neglect their daily operations by giving up their best resources. Although the Data Scientists were not dedicated to the pilot, commitment and motivation were always high, and this was reflected in the rapport of the team, which is vital for the success of any project. Having a Big Data environment and a governed Data Lake provides the scenario not only for other types of non-financial analysis in these banking entities, but also for analysis of corporate social responsibility and/or social networks (López et al., 2018).

V. Conclusions

The integration of the Big Data environment into financial management contributed to the efficient generation of products and services for the Business Development area, while also optimizing its processes, internal services, and decision-making at the management level.
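To make the campaign process concrete, the sketch below illustrates the kind of business-defined eligibility rule that such a credit campaign applies over the customer base. All field names and thresholds here are assumptions for illustration only (they are not from the paper); in the deployed pipeline, rules like this would run as distributed Spark jobs over the governed Data Lake rather than over in-memory Python lists.

```python
# Hypothetical credit-campaign eligibility rule; field names and
# thresholds are illustrative assumptions, not the entity's real rules.

def eligible_for_credit_campaign(customer: dict) -> bool:
    """Return True if the customer qualifies for the credit campaign."""
    return (
        customer["monthly_income"] >= 1500      # minimum income (assumed)
        and customer["months_as_client"] >= 6   # relationship tenure (assumed)
        and not customer["has_defaulted"]       # no defaults on record
    )

# Toy customer base standing in for the Data Lake's customer table.
customers = [
    {"id": 1, "monthly_income": 2000, "months_as_client": 12, "has_defaulted": False},
    {"id": 2, "monthly_income": 900,  "months_as_client": 24, "has_defaulted": False},
    {"id": 3, "monthly_income": 3000, "months_as_client": 3,  "has_defaulted": False},
]

# Select the campaign targets: only customers passing every rule.
targets = [c["id"] for c in customers if eligible_for_credit_campaign(c)]
print(targets)  # → [1] (only customer 1 passes all three rules)
```

Expressing each rule as a pure predicate like this is what lets the same logic be pushed down into a Spark `DataFrame.filter`, which is where the processing-time gains discussed in the conclusions come from.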
With the Big Data environment, the business areas will be able to develop advanced analytics to visualize and predict offers of new products and services, customer segmentation, and customer behavior for credit campaigns. This induces the business areas to become independent of the Engineering area (which was previously responsible for all development), because they have their own Sandboxes for their own developments and a governed Data Lake. The governed Data Lake provides a large volume of data for decision-making; in addition, the information is truthful thanks to the implemented data governance, agreed by consensus among the different business areas, given the value and variety that predominate in the Data Lake. Finally, distributed processing and storage give access to information and online report generation at a speed in line with the competition, once users have both Business Intelligence and Big Data capabilities.

The architecture is very flexible thanks to its hybrid nature (it contains open-source components in its structure), adapting to new market trends; for example, there are agreements with Google to integrate components such as Kubernetes, and with Amazon Web Services to use its endpoints and improve access to the services of our Big Data environment. All these components allow us to expand our knowledge and enrich the role of Data Architecture. The use of Spark optimizes the processing times, which were very high with traditional data engines: one of the campaign model processes took 24 hours to generate results, while with Spark it takes only 60 minutes.

References

1. Plase et al. (2017).
A comparison of HDFS compact data formats: Avro versus Parquet. Daiga Plase, Laila Niedrite, Romans Taranovs. https://journals.vgtu.lt/index.php/MLA/article/view/500/357
