Apps

With a professional foundation in business, I have cultivated a deep-seated passion for information technology, complemented by hands-on experience in database systems, including both relational and NoSQL architectures. My technical proficiency includes advanced SQL query development, enabling efficient data management and analysis. Throughout my career, I have designed and implemented diverse applications using programming languages such as JavaScript, VBA, VBScript, and PHP. Additionally, I have engineered full-stack web solutions built on modern server-side runtimes such as Node.js and Deno to ensure robust performance. Below, you will find a curated selection of projects from my portfolio, showcasing my technical adaptability and problem-solving capabilities.

Automated Digital Solution for Past Service Pension Adjustment Notification

Following the approval of Past Service Pension Adjustments (PSPAs) by the Canada Revenue Agency (CRA), pension plan administrators are mandated to issue confirmation letters to affected members. Historically, this process required manual generation and physical distribution via Canada Post. However, unforeseen operational constraints necessitated an urgent transition to a fully digital, paperless notification system within a strict implementation timeline.

To address this requirement, the communications department developed a revised digital notification document, which was integrated into the organization’s online member document portal for secure member access. As part of this initiative, I designed a Power BI query to systematically identify approved PSPA cases with pending notification letters. This report was subsequently provided to the IT department so the documents could be added to the document repository. While initial implementation was straightforward, a technical limitation emerged: the repository’s predefined configuration restricted member visibility to specific document types, excluding the new PSPA notification. Consequently, each member’s document visibility setting required manual adjustment, a labor-intensive task given an average monthly volume of 250 cases.
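
The selection logic behind that report can be sketched in JavaScript for illustration; the field names (status, letterIssued) are assumptions, as the actual report was built in Power BI against the pension administration database:

```javascript
// Hypothetical record shape; the real query joined PSPA approval and
// document-issuance tables. Only the filtering criterion is shown here.
function pendingNotifications(cases) {
  return cases.filter((c) => c.status === "APPROVED" && !c.letterIssued);
}

// Example: three cases, one approved and awaiting its letter.
const sample = [
  { member: "A001", status: "APPROVED", letterIssued: false },
  { member: "A002", status: "APPROVED", letterIssued: true },
  { member: "A003", status: "PENDING", letterIssued: false },
];
console.log(pendingNotifications(sample).map((c) => c.member)); // → [ 'A001' ]
```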

Recognizing the opportunity to streamline operations, I conceptualized and developed an automated bot solution over a weekend. The bot was engineered to process an Excel-based member list, autonomously access the document repository via a web interface, and update the visibility setting for the PSPA document type to “viewable.” Leveraging JavaScript, Puppeteer, and NW.js technologies, the bot reduced manual effort significantly, operating efficiently in the background while allowing users to focus on parallel tasks. The solution demonstrated robust performance, processing 250 entries in approximately 10 minutes, thereby enhancing operational efficiency and ensuring compliance with procedural timelines.
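
A minimal sketch of the bot's batch loop follows. The actual tool drove the repository's web interface with Puppeteer inside an NW.js shell; here the browser interaction is isolated behind a pluggable async action (all names are illustrative) so the control flow, including per-member error handling, can be shown on its own:

```javascript
// Iterate over the Excel-derived member list, attempting the visibility
// update for each member and collecting failures rather than aborting.
async function processMembers(memberIds, updateVisibility) {
  const failures = [];
  for (const id of memberIds) {
    try {
      // In the real bot this step used Puppeteer:
      // await page.goto(memberDocumentsUrl(id)); await page.click(visibilityToggle);
      await updateVisibility(id);
    } catch (err) {
      failures.push({ id, error: String(err) }); // keep going; report at the end
    }
  }
  return { processed: memberIds.length - failures.length, failures };
}

// Stub standing in for the Puppeteer page interaction.
const stub = async (id) => {
  if (id === "BAD") throw new Error("timeout");
};

processMembers(["M1", "M2", "BAD"], stub).then((r) =>
  console.log(r.processed, r.failures.length) // → 2 1
);
```

Collecting failures instead of halting on the first error mattered for unattended runs: a single flaky page load should not invalidate the remaining 249 cases.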

This automation not only resolved immediate workflow challenges but also established a scalable framework for future digital process enhancements.

Automated Data Change Tracking Solution via HTML Application (HTA)

During the initial design phase of the Pension Adjustment (PA) and Past Service Pension Adjustment (PSPA) calculation system, a specific scenario requiring supplementary calculation rules was identified. Due to the low case volumes and unique complexity of this scenario, a strategic decision was made to exclude it from the automated calculator and instead flag such cases for manual processing. Over time, however, the volume of these exceptions experienced a gradual increase, exceeding 150 cases. While the manual calculation itself remained straightforward, the operational challenge shifted to managing these records, as modifications to any of 35 distinct input data fields could necessitate recalculation. Manually monitoring these 35 fields across 150+ member records on a monthly basis, particularly given their distribution across disparate files, presented a substantial operational burden, requiring labor-intensive cross-referencing to identify changes since the last review cycle.

To address this inefficiency, I engineered a customized HTML Application (HTA) leveraging a multi-technology architecture. The solution utilized Microsoft Access for persistent data storage, VBScript and SQL to perform systematic record comparisons between iterations, and a front-end interface (HTML, CSS, JavaScript) to visualize results dynamically. This application further incorporated functionality for annotating and retrieving case-specific comments, enhancing auditability. By automating the detection of field modifications, a process previously reliant on manual file reviews, the solution reduced a task requiring hours of effort to mere minutes. Critically, it also eliminated the risk of incomplete reviews inherent to the manual approach, ensuring comprehensive oversight.
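
The core comparison step can be sketched in JavaScript for brevity (the HTA itself used VBScript and SQL against Access). The field list and record shape are illustrative; the real tool tracked 35 input fields:

```javascript
// Fields whose modification triggers a recalculation (35 in practice).
const TRACKED_FIELDS = ["salary", "serviceYears", "planCode"];

// Compare the current snapshot against the previous review cycle, keyed by
// member ID, and report which tracked fields changed for each member.
function changedSince(previous, current) {
  const prevById = new Map(previous.map((r) => [r.memberId, r]));
  const changes = [];
  for (const rec of current) {
    const old = prevById.get(rec.memberId);
    if (!old) {
      changes.push({ memberId: rec.memberId, fields: ["<new record>"] });
      continue;
    }
    const fields = TRACKED_FIELDS.filter((f) => old[f] !== rec[f]);
    if (fields.length) changes.push({ memberId: rec.memberId, fields });
  }
  return changes; // each entry: a member needing recalculation, and why
}
```

Because every tracked field is checked for every member on every run, the incomplete-review risk of the manual process disappears by construction.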

This innovation not only resolved immediate workflow bottlenecks but also established a scalable framework for managing complex, data-sensitive processes with precision and efficiency.

Strategic Power BI Dashboard Development for Holistic Operational Oversight

While I have engineered numerous Power BI dashboards, the majority served as foundational tools for static data visualization. A significant advancement occurred through the design and implementation of an enterprise-level, logic-driven dashboard solution, purpose-built to consolidate cross-product insights and enhance daily operational decision-making.

Historically, our systems delivered data on an individualized member basis. This was optimal for singular product analysis but inadequate for enterprise-wide visibility. Though existing Power BI reports provided granular insights into specific product categories, they lacked the capacity to aggregate critical metrics such as global tax product status, comprehensive production volumes, unissued product inventories, compliance milestone adherence, and management-level analytics into a unified interface.

To bridge this gap, I spearheaded the development of an advanced Power BI dashboard following collaborative requirements gathering with stakeholders. Leveraging SQL integration, the solution synthesized data across four core domains: Pension Adjustments (PAs), Past Service Pension Adjustments (PSPAs), Pension Adjustment Reversals (PARs), and Tax Receipts. The dashboard featured a dynamic interface with a multi-functional navigation bar, enabling users to toggle between two primary analytical perspectives:

  1. Annual Production Overview: A year-to-date visualization categorizing products by issuance status (issued/unissued), empowering teams to assess production trends, forecast workload pipelines, and strategize resource allocation.
  2. Product-Specific Compliance Analytics: A compliance-centric view evaluating adherence to product-specific regulatory deadlines and requirements, with embedded conditional logic to accommodate varying rulesets across product types.

The dashboard supported dual data representation modes (tabular and graphical) and integrated contextual pop-up modules to clarify complex terminology and operational guidelines. A key innovation was its self-sustaining architecture: data refreshes occurred automatically upon user access via direct SQL connectivity, eliminating manual maintenance while ensuring real-time accuracy.
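
The year-to-date issuance rollup behind the Annual Production Overview amounts to a grouping reduction, sketched here in JavaScript for illustration (the dashboard computed it in Power BI over SQL sources; domain and status field names are assumptions):

```javascript
// Count issued vs. unissued products per domain for the selected year.
// Domains in the real dashboard: PA, PSPA, PAR, Tax Receipt.
function productionSummary(products, year) {
  const summary = {};
  for (const p of products) {
    if (p.year !== year) continue;
    summary[p.domain] ??= { issued: 0, unissued: 0 };
    summary[p.domain][p.issued ? "issued" : "unissued"]++;
  }
  return summary; // e.g. { PA: { issued: 120, unissued: 14 }, ... }
}
```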

This solution not only transformed fragmented workflows into a centralized command center but also elevated organizational capacity to monitor compliance, optimize production cycles, and align tactical execution with strategic objectives.

Automated Pension Adjustment (PA) and Past Service Pension Adjustment (PSPA) Calculation Tool

Manual pre-issuance reviews of Pension Adjustments (PAs) and Past Service Pension Adjustments (PSPAs) historically posed inefficiencies due to time-intensive processes and the need for scenario-specific manual computations. To address these challenges, I engineered a robust Excel-based application leveraging VBA to automate and standardize PA/PSPA calculations.

The tool streamlined workflows by directly interfacing with the organization’s primary database to retrieve member-specific data and calculation inputs. It featured multi-functional capabilities, enabling users to execute calculations for individual products, product groups per member, or all products associated with a member. A key innovation was its transparent algorithmic breakdown, which displayed step-by-step computation logic to enhance user understanding and auditability. Additionally, the solution automated cross-referencing of results against system-generated values, flagging discrepancies for prioritized review.
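
The transparent breakdown idea can be sketched in JavaScript for brevity (the tool itself is Excel/VBA). The formula shown, 9 × benefit earned − $600, follows the CRA's general defined-benefit pension credit rule; plan-specific offsets and scenario handling are omitted:

```javascript
// Compute a defined-benefit PA while recording each intermediate step,
// so the user can audit the logic rather than trust an opaque result.
function calculatePA(benefitEarned) {
  const steps = [];
  const credit = 9 * benefitEarned;
  steps.push(`pension credit = 9 x ${benefitEarned} = ${credit}`);
  const pa = Math.max(0, credit - 600);
  steps.push(`PA = max(0, ${credit} - 600) = ${pa}`);
  return { pa, steps }; // steps expose the computation for review
}

console.log(calculatePA(2000)); // → { pa: 17400, steps: [ ... ] }
```

Returning the step list alongside the result is what made the tool auditable: reviewers could trace any flagged discrepancy back to a specific intermediate value.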

This application became integral to daily operations, significantly accelerating review cycles while minimizing human error. Adopted widely across team members, it provided a scalable, user-friendly solution that transformed a previously cumbersome manual task into an efficient, standardized process, ensuring consistency and compliance across all PA/PSPA evaluations.

Power BI-Based Pension Adjustment Reversal (PAR) Calculation Solution

In response to a business requirement for a PAR calculation tool analogous to the existing Pension Adjustment (PA) calculator, I was tasked with developing a solution that avoided reliance on VBA due to limited internal expertise and long-term supportability concerns. While Power BI is traditionally leveraged for data visualization and retrieval, I engineered an innovative application of the platform to fulfill complex calculation requirements.

The solution harnessed Power BI’s capabilities to systematically extract and stage multiple calculation inputs (e.g., years of service, pensionable salary, and system-derived PAR values) in temporary memory storage. It then executed a multi-step computational workflow: retrieving time-bound service periods, applying regulatory logic to derive PAR values, and cross-referencing results against system-generated figures. Discrepancies exceeding predefined thresholds were systematically flagged for investigation. Unlike the Excel-based PA calculator, which supported multi-scenario analysis, this tool specialized in batch processing PAR calculations for specified periods—precisely aligning with operational needs.
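
The cross-referencing step can be sketched as follows, in JavaScript for illustration (the actual workflow ran inside Power BI). The $1 tolerance and record shape are assumptions, not the real thresholds:

```javascript
// Compare calculated PAR values against system-generated figures and
// flag any record whose difference exceeds the tolerance.
function flagDiscrepancies(records, tolerance = 1) {
  return records
    .map((r) => ({ ...r, diff: Math.abs(r.calculatedPAR - r.systemPAR) }))
    .filter((r) => r.diff > tolerance);
}

const flagged = flagDiscrepancies([
  { member: "X", calculatedPAR: 1000, systemPAR: 1000.5 }, // within tolerance
  { member: "Y", calculatedPAR: 1000, systemPAR: 1010 },   // flagged
]);
console.log(flagged.map((r) => r.member)); // → [ 'Y' ]
```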

To complement this automated process, I designed a supplemental documentation framework using Report Builder. This template, integrated with the Power BI calculator, enabled manual calculations for edge cases incompatible with automated processing. The template standardized audit-ready documentation, allowing users to generate PDF outputs for permanent storage with member digital files, ensuring compliance with record-keeping protocols. This dual-component solution achieved three critical objectives:

  1. Operational Continuity: Eliminated VBA dependency while maintaining calculation integrity.
  2. Process Efficiency: Automated bulk PAR computations with built-in validation, reducing manual effort.
  3. Regulatory Compliance: Ensured traceability through systematized documentation, mitigating audit risks.

By repurposing Power BI beyond its conventional use case, the project demonstrated the platform’s adaptability in addressing niche calculation challenges while adhering to scalability and user accessibility requirements.

Journal Entries Documentation and Data Tracking Application

The allocation of employer contributions was automated through system integration, triggered upon payment entry. This process simultaneously generated corresponding journal entries for each transaction. However, post-implementation analysis revealed that full automation of journal entries was infeasible, necessitating a supplemental manual workflow. At month-end, manual entries were documented in an Excel spreadsheet, capturing debits, credits, and client details. While initial spreadsheet setup was straightforward, tracking outstanding transactions monthly required substantial manual oversight.

To enhance efficiency, I designed an Excel-based solution using VBA (Visual Basic for Applications) to interface with the organizational database. Upon execution at month-end, the application automates data retrieval, populates preformatted journal entry templates, generates an email with attached spreadsheets, and routes it to Corporate Accounting for processing—all via a single user command. Corporate Accounting’s complementary tool then parses the spreadsheet data for seamless integration into the General Ledger (GL).
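
The month-end assembly step can be sketched in JavaScript for illustration (the tool itself is VBA): retrieved transactions are rolled up into journal lines, and the batch is verified to balance before it is emailed to Corporate Accounting. The record shape and tolerance are assumptions:

```javascript
// Build journal lines from transactions (side is "debit" or "credit")
// and refuse to emit an out-of-balance batch.
function buildJournal(transactions) {
  const totals = { debit: 0, credit: 0 };
  const lines = transactions.map((t) => {
    totals[t.side] += t.amount;
    return { account: t.account, side: t.side, amount: t.amount, client: t.client };
  });
  if (Math.abs(totals.debit - totals.credit) > 0.005) {
    throw new Error("journal out of balance"); // stop before emailing
  }
  return lines;
}
```

Validating the batch before it leaves the department is the safeguard that keeps a keying error from reaching the General Ledger.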

This solution significantly reduced administrative burdens by eliminating repetitive tasks, minimizing manual errors, and enabling staff to focus on analytical and strategic initiatives.

Confluence-Based Calendar Application

Historically, team leaders independently managed their teams’ working hour allocations. Over time, cross-functional visibility into schedules—including working hours and vacation plans across departments—emerged as an operational priority. To address this need, I was assigned to set up a centralized calendar solution. Initial evaluations of existing tools, such as Microsoft Exchange and enterprise calendar systems, revealed critical limitations: some lacked robust security protocols (e.g., unrestricted editing permissions), while others imposed overly rigid constraints on user input.

To resolve these challenges, I spearheaded the development of a secure, customizable calendar within Confluence, the organization’s primary knowledge management and project documentation platform. Capitalizing on Confluence’s native support for HTML, CSS, and JavaScript, I engineered an interactive calendar interface that enabled users to input work hours, schedule vacations, attach contextual comments, and upload supporting visuals. The solution featured a team-specific configuration menu to streamline data entry and enforce role-based access controls.

Due to restricted access to Confluence’s backend database, I deployed a local server on a networked standalone workstation, utilizing Node.js and SQLite for data storage and retrieval. Asynchronous API calls (via AJAX) facilitated real-time data synchronization between the Confluence page and the server. The application achieved broad adoption, successfully centralizing scheduling visibility and coordination across teams.

While the solution operated effectively, its dependency on continuous server availability introduced scalability risks, ultimately necessitating migration to another calendar system. Despite this limitation, the project significantly advanced my technical expertise in JavaScript, API integration, and server management.