
Tech News Blog

Connect with TECH NEWS to discover emerging trends, the latest IT news and events, and enjoy concrete examples of why Technology First is the best connected IT community in the region.


  • 06/01/2024 9:04 AM | Abby Pytosh (Administrator)

    The Industrial Internet of Things (IIoT) and adjacent technologies continue to have a profound impact on industrial processes, creating opportunities for product and service transformation. From intuitive graphic interfaces and intelligent device sensors to full-scale industrial workcells and more, successful technology integration is proven to optimize operations. That said, here are several concepts for companies to consider: 

    Install a Robot 

    An outside-of-the-box concept for many to consider, the adoption of robotic technology further enhances the IIoT ecosystem, driving further growth toward operational excellence. Where applicable, companies that successfully integrate and synchronize highly efficient robots with their current network of devices, including capital equipment, have discovered benefits such as greater production workflow, part quality, product throughput, operational safety, and return on investment (ROI). 

    A prime example of this is the integration of an industrial or human-collaborative robot to load/unload machined parts. A highly mundane and potentially dangerous task (depending on the part and environment), the combined use of innovative end-of-arm tooling (EOAT), easy-to-use electrical interfaces and flexible-yet-robust robots enables highly consistent part transfer. Not only is this reliable and consistent method of transfer ideal for protecting the integrity of capital equipment such as press brakes, but also, employee health and safety concerns can be substantially minimized. As an added benefit, workers can be redeployed to safer, value-added tasks for increased competitive edge. Applications for palletizing, pick and place, and welding are other labor intensive tasks that are frequently automated. 

    Execute Security Standards 

    Whether an industrial robot or another piece of machinery has been deployed, companies that integrate and synchronize equipment to an IIoT framework should understand the potential cybersecurity threats and vulnerabilities these machines (and others for that matter) can face. From exploiting weak passwords to ransomware attacks and more, there are a variety of ways “bad actors” try to disrupt operations. For these reasons, it is important for decision makers to take all necessary steps to ensure robot and enterprise safety.  

    Adhering to robot safety standards and industry best practices such as the Robot Security Framework (RSF) is suggested. Additionally, replacing default passwords with strong passwords, along with backing up robot and peripheral data at regular intervals is helpful. 
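
    As an illustrative sketch of the password advice (not vendor-specific guidance), a strong replacement credential can be generated with Python's standard `secrets` module, which draws from the operating system's cryptographically secure random source:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets uses the OS CSPRNG, unlike the predictable `random` module
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

    A 20-character password from this alphabet has far more entropy than any factory default, and the same module can generate API tokens for backup jobs.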

    Encryption and authentication techniques to protect data and communication may also be helpful to securely connect robots to necessary systems or networks. Protocols like Secure Sockets Layer (SSL) and Transport Layer Security (TLS) can aid in encrypting and authenticating data transmitted over the internet or other networks, while Virtual Private Network (VPN) technology can create a secure and encrypted tunnel between devices and a remote server. Public Key Infrastructure (PKI) may also be used to employ digital certificates and keys to encrypt and authenticate data. 
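
    As a minimal sketch of the TLS side of this, Python's standard `ssl` module can build a client context that enforces certificate verification before a controller opens a connection to a robot gateway (the gateway host name in the commented usage is hypothetical):

```python
import ssl

def make_verified_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that refuses unverified peers."""
    ctx = ssl.create_default_context()            # loads the system CA certificates
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocol versions
    ctx.check_hostname = True                     # hostname must match the certificate
    ctx.verify_mode = ssl.CERT_REQUIRED           # peer must present a valid certificate
    return ctx

# Hypothetical usage against a robot gateway:
# import socket
# with socket.create_connection(("robot-gateway.example.local", 8883)) as sock:
#     with make_verified_tls_context().wrap_socket(
#             sock, server_hostname="robot-gateway.example.local") as tls:
#         tls.sendall(b"STATUS\n")
```

    The defaults from `create_default_context()` already verify certificates; pinning a minimum protocol version on top of that is a common additional hardening step.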

    “Hardening” devices and existing networks to withstand physical and logical attacks is also important. This process is done by applying security measures and configurations that disable and/or remove any service and feature that is not required for robot or device operation. This includes apps, interfaces, ports, protocols, etc. 
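
    One small, testable piece of that hardening process is auditing a device's open ports against an explicit allowlist of required services; the sketch below (with hypothetical port numbers) flags anything that should be disabled or removed:

```python
def audit_open_ports(open_ports: set[int], allowlist: set[int]) -> set[int]:
    """Return ports that are open but not on the approved allowlist."""
    return open_ports - allowlist

# Hypothetical example: only HTTPS admin (443) and the robot protocol port
# (10000) are required, so legacy FTP (21) and Telnet (23) get flagged.
approved = {443, 10000}
observed = {21, 23, 443, 10000}
unexpected = audit_open_ports(observed, approved)
```

    Running a check like this on a schedule turns "disable what isn't needed" from a one-time project into a continuously verified configuration.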

    Clearly defining internal roles and responsibilities for managing the robotic system (and inter-connected devices), when needed, is also ideal. While this is not a complete list of protocols and methods that may be used, it indicates that cybersecurity is a real threat that should be taken seriously, and proper precautions should be implemented to protect operational integrity. 

    Use Machine Monitoring 

    Many devices (CNC machines, robots, grippers, scanners, torches, etc.) can provide a wealth of information pertaining to equipment performance and operational trends. The ability to check, harness, and transform this data into actionable insights is extremely valuable for achieving the highest level of operational efficiency – as it enables data-driven optimized planning for key decision making. 

    That said, the implementation of a factory automation monitoring system that supports devices from multiple brands and collects data in real time is suggested. From a manufacturing perspective, IIoT monitoring tools (along with product/part tracking) are helpful for detecting system errors, part defects and production bottlenecks. 

    Proven edge server solutions that use a leading OPC-UA interface to enable an integrated, intelligent, and innovative approach to data analytics are ideal. This allows decision makers to see what is happening at any point on the value creation chain. In turn, this helps to make informed choices that provide the ability to better manage supply chain complexity, maintain high-throughput production and execute strategic company goals. 
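
    To make the monitoring idea concrete, here is a small, vendor-neutral sketch: a rolling average over sensor samples with a threshold alarm, the kind of check an edge collector might run before forwarding data upstream. The sensor name, window size, threshold, and readings are all illustrative:

```python
from collections import deque

class SpindleMonitor:
    """Track a rolling average of sensor readings and flag threshold breaches."""

    def __init__(self, window: int, threshold: float):
        self.samples = deque(maxlen=window)  # oldest readings drop off automatically
        self.threshold = threshold

    def add(self, value: float) -> bool:
        """Record a reading; return True if the rolling average breaches the threshold."""
        self.samples.append(value)
        avg = sum(self.samples) / len(self.samples)
        return avg > self.threshold

# Simulated temperature readings: the alarm fires once the 3-sample average climbs
monitor = SpindleMonitor(window=3, threshold=80.0)
alerts = [monitor.add(v) for v in [70.0, 75.0, 78.0, 95.0, 99.0]]
```

    Averaging over a window rather than alerting on single samples is a simple way to suppress noise spikes while still catching genuine drift.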

    Practice Preventative Maintenance 

    The key to peak performance operations is maintaining the health of a robot and other capital equipment. While the use of machine monitoring for predictive and preventative maintenance can play a large part in the life cycle management of automated tools, visual checks of a robot system should not be overlooked. From performing a grease analysis to monitor iron levels to doing a manual test to check for worrisome vibrations and gear noise, there are common assessments end users should perform to protect their robotic investment. 

    With any high-end purchase, it is always smart to invest in the value-added support programs available through the equipment supplier. Not only does this help ensure maximum asset performance, but it also provides prime ROI. Locking into an annual or extended service plan can augment a company’s preventative maintenance strategy, while ensuring issues are addressed in a timely manner to optimize uptime. 

    Whether protecting a robot or robot system purchase, or maintaining the life cycle of another piece of capital equipment, these concepts should help build a solid foundation. As always, any questions should be directed to your robot supplier or equipment manufacturer – as they will be the best source of information for moving forward. 

    Bio: Bill Edwards is Sr. Manager of Collaborative Robotics at Yaskawa Motoman, where he strategically oversees all aspects of collaborative robot planning, design, specification and approval. With over three decades of experience in engineering and project management, as well as control systems application and design, Bill is dedicated to developing safe, high-quality robots that foster greater production efficiency. He is a voting member for both the International Organization for Standardization (ISO) and the American National Standards Institute (ANSI), where he serves on various industrial robot safety committees. 

  • 06/01/2024 9:02 AM | Abby Pytosh (Administrator)

    In today's rapidly evolving technology landscape, safeguarding your investment in technology is good business sense. With the ever-present threat of cyber-attacks to companies of all sizes and the need for scalability, businesses must adopt proactive measures to protect their assets while maintaining flexibility for growth. This article explores a concept called “continuous threat exposure management” and strategies to ensure scalability in the face of these evolving challenges.

    Continuous Threat Exposure Management

    1. Vulnerability Assessment: Scope for cybersecurity exposure
    2. Develop a discovery process for assets and their risk profiles
    3. Prioritize the threats most likely to be exploited
    4. Validate how attacks might work and how systems might react
    5. Mobilize people and processes

    Continuous Threat Exposure Management (CTEM) is a formal, proactive approach to identifying, assessing, and mitigating risks to an organization's digital assets. It involves continuously monitoring the organization's technology infrastructure, applications, networks, and data for vulnerabilities and potential threats. The goal of CTEM is to minimize the organization's exposure to cyber threats by identifying and addressing weaknesses before they can be exploited by attackers. The need for processes like CTEM in organizations of all sizes is an unfortunate reality of today’s world.

    Drawing on Gartner’s CTEM model, the major practices that support this process include:

    Vulnerability Assessment: Regularly scanning and assessing the organization's systems and networks to identify vulnerabilities, misconfigurations, and weaknesses that could be exploited by cyber threats.  

    The full vulnerability assessment process is an ongoing investigation, covering not only which ports are accessible from the Internet, but also a complete scan of internal resources to see what a “bad guy” could reach once inside the network. In years past, it was often deemed sufficient to scan your company’s “public” footprint, see what ports might be open to internal resources, and identify any misconfigurations or flawed security from that perspective. More recently, however, since over 90% of cyberattacks begin with a phishing email that compromises either a local system or a cloud email platform, looking at the technology from the perspective of the bad guy is the better approach.

    Most organizations, especially smaller companies, may not have the internal resources or tools to conduct these types of scans themselves. The use of an external 3rd party resource that specializes in Cyber Security Penetration Testing is advised. Even MSPs (Managed Service Providers) find it wise to outsource this specialized service to 3rd parties on behalf of their clients.
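
    A rudimentary sketch of the scanning idea, using only the Python standard library: real assessments use purpose-built scanners, but at bottom the check is just a connection attempt per port. This should only ever be run against infrastructure you are authorized to test:

```python
import socket

def scan_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` accepting TCP connections on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Hypothetical usage: scan_ports("192.0.2.10", [22, 80, 443, 3389])
```

    Anything this simple misses UDP services, filtered ports, and banner details, which is part of why outsourcing to a penetration-testing specialist is advised.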

    Threat Intelligence Integration: Incorporating threat intelligence feeds from various sources to stay informed about emerging threats, attack techniques, and indicators of compromise relevant to the organization's industry and technology environment.

    Keeping up to date on everything technology related is a daunting process, and now Threat Intelligence demands close attention as well. While there are many open-source, online resources that provide current information on threats, tracking them all is difficult, especially for small businesses.

    The FBI’s InfraGard program is a collaborative partnership between the FBI (Federal Bureau of Investigation) and members of the private sector. Authorized users of the InfraGard program can share information and take part in networks and educational workshops to keep up on threats relevant to 16 specific critical infrastructure categories.

    There are also 3rd party services that provide consolidated vulnerability and threat intelligence resources.

    Patch Management: Implementing a structured process for installing security patches and updates promptly to address known vulnerabilities in software, operating systems, and applications.

    It is common knowledge that Microsoft, as one of the predominant software providers, releases their standard patches on “Patch Tuesday” -- the second Tuesday of each month. Patch Tuesday is the unofficial term for the day when Microsoft releases update packages for the Windows operating system and other Microsoft software applications, including Microsoft Office. In some cases, Microsoft will issue "out-of-band" updates for particularly critical security flaws, especially ones that are being exploited in the wild.

    As Microsoft fixes security vulnerabilities, it doesn't release those patches immediately. Instead, the company gathers the fixes into a larger update, which is released on Patch Tuesday.

    Windows workstations and servers automatically (by default) check for updates about once per day. The average system should automatically download these updates quickly but may delay installation.

    Given the number of issues with Microsoft updates over the past several years, many organizations hold off on applying updates for a week or two to make sure no problems surface.
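
    Since Patch Tuesday is simply the second Tuesday of the month, a patching calendar can be computed directly; this short sketch finds the date for any month using only the standard library:

```python
import datetime as dt

def patch_tuesday(year: int, month: int) -> dt.date:
    """Return the second Tuesday of the given month (Microsoft's Patch Tuesday)."""
    first = dt.date(year, month, 1)
    # date.weekday(): Monday == 0, so Tuesday == 1
    days_until_tuesday = (1 - first.weekday()) % 7
    first_tuesday = first + dt.timedelta(days=days_until_tuesday)
    return first_tuesday + dt.timedelta(days=7)
```

    An organization that defers updates for two weeks, as described above, would then schedule deployment at `patch_tuesday(year, month) + dt.timedelta(days=14)`.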

    Security Monitoring: Using monitoring tools and technologies to continuously monitor network traffic, system logs, and user activities for signs of suspicious or malicious behavior that could indicate a security threat.

    Network monitoring is crucial for small businesses to ensure the health and functionality of their computer networks. In today’s digital landscape, where businesses heavily rely on technology, having a robust network monitoring system is essential to find issues and potential threats. As small businesses often have limited IT (Information Technology) resources, it becomes even more vital to have efficient network monitoring in place.
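
    A minimal example of the log-watching side of this: counting failed login attempts per source address and flagging repeat offenders. The log format, field positions, and threshold here are all illustrative; production tooling would parse real syslog or SIEM events:

```python
from collections import Counter

def flag_brute_force(log_lines: list[str], threshold: int = 3) -> set[str]:
    """Return source IPs with at least `threshold` failed login attempts."""
    failures = Counter()
    for line in log_lines:
        if "FAILED LOGIN" in line:
            # assume the source address is the last whitespace-separated field
            failures[line.split()[-1]] += 1
    return {ip for ip, count in failures.items() if count >= threshold}

logs = [
    "09:01 FAILED LOGIN user=admin from 203.0.113.9",
    "09:02 FAILED LOGIN user=admin from 203.0.113.9",
    "09:02 LOGIN OK user=lsmith from 198.51.100.4",
    "09:03 FAILED LOGIN user=root from 203.0.113.9",
]
suspects = flag_brute_force(logs)
```

    Even a simple aggregate like this surfaces the pattern a human skimming raw logs would miss, which is the core value proposition of continuous security monitoring.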

    Incident Response Planning: Developing and regularly testing incident response plans to ensure the organization is prepared to detect, contain, and respond effectively to security incidents when they occur.

    Businesses should have a written plan that identifies the steps to take in an incident, including notifications to Cyber Insurance carriers, customers, and law enforcement. Preventive steps to maintain business functionality include backup and recovery procedures that help a business recover and get back to normal operation as quickly as possible.

    Risk Prioritization and Remediation: Prioritizing vulnerabilities and security risks based on their severity, likelihood of exploitation, and potential impact on the organization's operations, and implementing proper remediation measures to mitigate these risks.
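
    The prioritization step can be sketched as a simple scoring exercise: rank each finding by the product of its severity, likelihood, and impact so the riskiest items are remediated first. The 1–5 scale and the sample findings below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    severity: int    # 1 (low) .. 5 (critical)
    likelihood: int  # 1 .. 5
    impact: int      # 1 .. 5

def prioritize(findings: list[Finding]) -> list[Finding]:
    """Order findings so the highest combined risk score is remediated first."""
    return sorted(findings,
                  key=lambda f: f.severity * f.likelihood * f.impact,
                  reverse=True)

queue = prioritize([
    Finding("Default printer password", severity=3, likelihood=4, impact=3),
    Finding("Unpatched VPN appliance", severity=5, likelihood=5, impact=5),
    Finding("Stale test account", severity=2, likelihood=2, impact=2),
])
```

    Real risk models weight these factors differently, but even a crude multiplicative score forces the conversation about which exposure actually gets fixed first.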

    The formal Continuous Threat Exposure Management (CTEM) process is an approach to identify, assess and mitigate risks to an organization's technology assets. While this approach is ideal in a perfect world, it does entail significant investments in processes and resources.

    For smaller businesses without the internal resources for this process, a Managed Service Provider may be able to provide these services or coordinate with 3rd parties for some of these steps, such as Vulnerability Assessments, understanding threats, implementing Patch Management controls and Security Monitoring.

    Bio: Barry Hassler is the founder and President of Hassler Communication Systems Technology, Inc (HCST), a business IT Managed Services Provider based in Beavercreek OH. HCST has been in business since 1991 and serves a variety of small businesses primarily in the Dayton and Springfield, Ohio areas.

    Panetta, Kasey. “How to Manage Cybersecurity Threats, Not Episodes.” Gartner, 3 May 2024.

  • 06/01/2024 9:00 AM | Abby Pytosh (Administrator)

    Copiers (MFPs) and printers are often forgotten when it comes to a company’s security policy. As IT professionals continue to invest time and money into tightening their cybersecurity, it is vital to include MFPs and printers in that policy. Copiers of years past were simple devices that were not connected to a network and only made copies. As technology has advanced, so have MFPs. These devices are just like any other device on your network and are a gateway to an enterprise’s most sensitive data.

    According to a 2023 Print Security Landscape Survey by Quocirca, 61% of organizations have experienced a print-related data loss over the past year and 39% struggle to keep up with printer security.  

    Below are the top MFP and Printer security risks and how to mitigate them:

    1. Physical Access: Documents left on the output tray pose a significant security risk. We recommend a “follow you” print strategy that enables the device to “hold” the job until an end user authenticates and releases it at the device.

    2. Default passwords: Factory preset credentials need to be changed by administrators.

    3. Lock Down Scanning Access: MFPs are often used more for scanning than copying. At minimum, scanning should be locked down to scan only to your company’s domain. To take it a step further, having users authenticate with their user credentials and enabling “scan to myself” is a best practice. If scanning isn’t locked down, confidential documents can be sent to personal Gmail or Yahoo addresses and your company wouldn’t even know they were sent.

    4. Lack of Security updates: Your Vendor should work with you to ensure these devices have updated firmware and software patches.

    5. Hard drive wipes: Before these devices leave your environment to either be shipped back to a leasing company or decommissioned, it is vital to wipe or destroy the hard drive.

    6. Track and audit device usage: Many companies are investing in print management software to track who is using these devices and how they are using them.  You can limit what a user has access to and have audit trails of how each employee is using the device.

    7. Home office printing: With the trend of a hybrid workplace continuing to grow, it is vital to make sure that home office printers are configured and set up with the same security settings as your in-office devices are.

    8. Standard security features on devices: Many devices now come with SIEM integration, Verify System at Startup, SSD Data Encryption, and Encrypted Secure Print.  It is important to ensure that these security features are enabled and set up properly.

    9. Cloud printing risks: Cloud printing services are on the rise, but it is important that your provider has robust security measures in place.  End-to-end encryption of print jobs, reviewing activity logs and reviewing access control all need to be in place.
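
    The “follow you” release model from item 1 can be sketched as a held queue: jobs accumulate server-side and reach an output tray only after the submitting user authenticates at a device. The class and method names below are hypothetical, not any vendor's API:

```python
class FollowYouQueue:
    """Hold print jobs until the submitting user authenticates at a device."""

    def __init__(self) -> None:
        self._held: dict[str, list[str]] = {}

    def submit(self, user: str, document: str) -> None:
        """Queue a job; nothing reaches an output tray yet."""
        self._held.setdefault(user, []).append(document)

    def release(self, user: str) -> list[str]:
        """Called after badge/PIN authentication at the device; returns jobs to print."""
        return self._held.pop(user, [])

queue = FollowYouQueue()
queue.submit("lsmith", "payroll_q2.pdf")
queue.submit("lsmith", "offer_letter.pdf")
jobs = queue.release("lsmith")    # only now do the documents print
leftover = queue.release("jdoe")  # nothing held for this user
```

    The security property is in the structure itself: a document that is never rendered until its owner is physically present cannot be left sitting in an output tray.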

    MFPs and printers are an integral piece of technology in organizations, but if left unmanaged they can pose a high security risk. Confidential data regularly moves between users’ PCs, servers, and MFPs/printers, so it’s important to have a security plan in place to protect your print environment. 

    Bio: Leah Seymour is the Senior Sales Director for Modern Office Methods (MOM) and has 26 years of experience in the Office Equipment Industry.  She specializes in working with IT Leaders in Healthcare, Manufacturing, Logistics and Higher Education to help them improve productivity, control costs and secure their devices.

  • 06/01/2024 8:00 AM | Abby Pytosh (Administrator)

    Building a technology organization involves substantial investments – from hardware, software, and cloud solutions to the critical processes governing, modernizing, and maintaining operations, not to mention the invaluable talent driving these initiatives. 

    Your investment in people is particularly crucial, as skilled personnel play a pivotal role in developing, managing, and effectively utilizing technology. Identifying, recruiting, and onboarding the right talent requires significant time and resources. However, the journey doesn't end there. Retaining these team members necessitates a strategic plan and continued commitment of time and resources. 

    Here's how your business can safeguard its investment in people: 

    Leverage Technology First:

    • Attend peer group meetings for insightful peer-to-peer sharing and learning. 
    • Transform a peer group meeting into a one-on-one development opportunity by bringing a team member along. 
    • Expand knowledge and strengthen networks by attending conferences. 
    • Encourage subject matter experts (SMEs) to submit presentations at conferences, enhancing their speaking skills. 
    • Foster collaboration by actively participating in our peer resource group forums. 
    • Join a committee and contribute resources to give back to the community. 
    • Share volunteer opportunities with your team to promote team building. 
    • Nominate a team member for a Technology First Leadership Award and attend the event to support all finalists.

    Prioritize Well-being and Development: 

    • By prioritizing the well-being, growth, and development of your technology workforce, your business can ensure a resilient and motivated team that drives innovation in the dynamic technology landscape. 

    Connect, Strengthen, and Champion Your People: 

    • The more you invest in connecting, strengthening, and championing your people, the more invested they will be in the career you're developing together. 

    As you focus on protecting your technology investment, take a moment to enjoy summer activities that energize you. We hope to see you at one of our upcoming Tech First events! 

  • 04/25/2024 12:32 PM | Deleted user


    Stop it!  Stop it right now!


    We all know the classic Hollywood trope: the heroine stands her ground in front of the “false threat”. In the face of her unexpected authority and steadfast agency, that threat suddenly comes up short (and to great comic effect!). From there, what initially presents itself as an insurmountable obstacle to the heroine’s journey transforms into a loyal and steadfast ally for the rest of the plot. Think: Dorothy and the Cowardly Lion, or Cher and Nicolas Cage, or Kagome and Inuyasha. 

    If only life would work like that.  If only our work would work like that.

    There are a lot of lessons to be pulled out of these stories: team building, change management, system transformation, etc.  But there’s a warning here, too.  A warning that gets glossed over too often and leads to a lot of failed team building, change initiatives, and system transformations.  The warning of the Magic Bullet Mentality - mistaking the resolution of the “false threat” for the solution of the real problem.

    Everyone falls for the Magic Bullet at some time; most times, multiple times.  “Do this one thing!” “Make this one change!” “Install this one tool!” Too often, especially in Information Technology, the Magic Bullet Mentality leads to buying and installing some very high-priced (and, therefore, high-visibility) tools that fail to live up to their promises.  And what do you have to show for it all?  A big hole in your budget, a burned-out support team, and a lot of angry questions from your business leadership.

    So, if we are all susceptible to the trap, how can we avoid this ‘magic bullet’ thinking?  Even better, is there a way to rescue a deployment that obviously started with a ‘magic bullet’ in mind?  There are three questions to ask, right now, to help:

    1. Do you have a “Process Map”?

    Process mapping might not be the latest buzzword, but it’s darned important nonetheless.  And the sooner the process mapping is done around the existing tool, the better.  That’s right; the existing tool.

    Too often we’re so enamored with the promised benefits of the new tool, we want to skip to the end and not do the due diligence necessary to map out the work ahead.  But how else can one ensure new tooling will successfully take over the job of the old one, if you can’t describe the work the old one was doing?  You can’t.  You need a baseline to measure against.

    The Process Map as a baseline has another benefit.  Once completed, a savvy project manager can take it, break it down into the individual components and efforts, and build a much more effective project plan.  And a project manager with a proper project plan will be much more likely to succeed.  Already started the project and don’t have the Process Map completed? 

    *SMACK* “Shame on you!” - Dorothy Gale, Wizard of Oz (1939)

    2. Do you have a “Data Map”?

    Data mapping is a newer buzzword, but it is not the same as a Process Map.  A Process Map charts the relationships between business functions and processes.  It provides insight on: which departments are responsible and accountable for what activity, who gets informed or consulted about those results, and what needs to happen before and after.  A Data Map describes the flow of information between the various systems and technologies described in the Process Map, down to the individual data attributes. 

    Consider an example.  Three teams work together to procure hardware and software assets: Vendor Management, Purchasing, and IT Service Management.  Each uses different tools to do their jobs, and they coordinate using email, chat, and Excel spreadsheets.  The Process Map will show how these teams work together, and how a new ERP system could provide response and operational efficiencies.  The Data Map would show that the Service Management team needs cost center codes and project number data in order to keep break/fix inventory separate from capital expensed equipment.  And, come to find out, that new ERP system requires a whole separate financial module to be able to handle that demand.
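
    That attribute-level flow is exactly what a Data Map captures. A lightweight sketch might record each flow as a source system, a target system, and the attributes exchanged; all of the system and attribute names below come from the hypothetical procurement example above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataFlow:
    source: str
    target: str
    attributes: tuple[str, ...]

data_map = [
    DataFlow("Vendor Management", "Purchasing", ("vendor_id", "contract_terms")),
    DataFlow("Purchasing", "IT Service Management",
             ("cost_center_code", "project_number")),
]

def attributes_required_by(system: str, flows: list[DataFlow]) -> set[str]:
    """Every attribute a system consumes -- gaps here surface missing modules early."""
    return {attr for flow in flows if flow.target == system
            for attr in flow.attributes}

needed = attributes_required_by("IT Service Management", data_map)
```

    Querying the map this way is how the missing financial module gets discovered on paper, before the ERP contract is signed rather than after.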

    Let me guess.  You already started the project without a data map? 

    *SMACK* “Snap out of it!” - Loretta Castorini, Moonstruck (1987)

    3. Have you done a “Feature Analysis”?

    To be fair, the issue exemplified above could also have been noticed with a proper Feature Analysis.  Nobody likes doing them, but the money and heartache that can be avoided by making sure the right tool is chosen cannot be stressed enough!  A Feature Analysis goes beyond merely sitting through a boring sales demo, or copy/pasting the feature comparison from .  The Feature Analysis lays out the manufacturer’s plans and roadmaps for future features, prices out the cost of support services and discounts offered, interviews and investigates other customers’ satisfaction with the tool and implementation.  The Feature Analysis will also draw direct connections between issues internal clients are having with their existing tooling and the features offered by all the other tools being considered.

    Unfortunately, a Feature Analysis will not do much good if the tool has already been purchased.  But if you are still in the planning stage?

    *SMACK* “Sit boy!” - Kagome, Inuyasha (2000)

    BONUS QUESTION: Consider a third-party consultant?

    The idea of bringing in a third-party consultant - in addition to the internal technical personnel, the manufacturer’s technical resources, project manager, stakeholders, etc. - gives the feeling of ‘too many cooks in the kitchen.’  However, if the installation project is already in flight and there is a high degree of “magic bullet” thinking, a third-party consultant can be just the solution. 

    The manufacturer’s provided resources are really there for one goal: to complete the installation to the client’s satisfaction (or contractual obligation).  Usually, they will be running a script, checking boxes, and providing only specific solutions to the specific technical challenges that arise.  They will not be able to make suggestions surrounding process changes, data flows, internal responsibilities or external accountabilities.  The internal team might know the ins-and-outs of the current platform, policies, and users.  But they might not have the experience with large digital transformations and the challenges the new tooling will present.

    A third-party, independent consultant can provide the experience and guidance to help both the internal and external teams succeed.  An experienced consultant will have actively participated in a number of similar deployments (not just the tool in question, but competitors as well).  They will be able to describe the usual mistakes and provide recommendations to the internal team that increase their chance of success.  At the same time, they can engage the tooling resources to ensure challenges are overcome in a way that secures long-term success for the client (not just the installation).  A third-party, independent consultant can also provide benefits in other areas: completing documentation, crafting training materials, and building internal reporting to ensure the new tool stays useful.

    Bio: Jeremy Boerger founded Boerger Consulting, to improve and promote the benefits of Information Technology Asset Management (ITAM) programs for medium and large businesses.  He is a published author and sought-after speaker at technology expos, conventions, and podcasts around the world.  His clients routinely see a one-third reduction in their software operations budget, usually within a nine- to twelve-month timeframe.

  • 04/25/2024 12:19 PM | Deleted user

    In the tech leadership world, there’s this common trap we can fall into—hiding away in our offices. It's easy to do, especially for those of us who lean a bit on the introverted side. But here’s the thing: to really get the pulse of what’s happening in our field, we've got to get out there and engage with the people doing the actual work. Technology’s all about making things easier and more innovative for our teams, but there’s often a gap. Folks not living and breathing tech might not know about all the cool tools they could use, and us tech leaders might not see the creative shortcuts or workarounds they're using to get things done.

    So, how do we bridge this gap? How do we get more in tune with the heartbeat of our companies? Here are a few thoughts:

    • Chat it up in the hallways: Seriously, just talk to people. Get to know the movers and shakers in your company. You’ve got to build some trust, so they'll open up about their daily grind. When you’re visiting different sites or meeting with customers, leave some room in your schedule for those chance hallway chats that can lead to gold. Find out what big problems they’re trying to crack and maybe, you can throw in your two cents about a project that might help them out.
    • Mix your teams: Have your tech folks spend some time shadowing the end-users. It’s eye-opening to see if the tools you’ve deployed are actually making life easier for them or if they're just sitting there gathering virtual dust.
    • Know your stuff: This one’s a given but keep digging deep into what your business systems can do. The more you know, the better you can jump into conversations with something useful to contribute.
    • Circle back on solutions: After you roll out a new tool or system, don’t just walk away. Check back in after a month or two to see if it’s really making a difference or if it needs a tweak here and there.

    Getting out of your office and engaging with your team not only helps your tech team get seen as the go-to problem solvers but also makes your work a lot more enjoyable. It’s about building connections, understanding the real-world application of your work, and continuously improving things.

    In short, stepping away from the comfort zone of your office and diving into the daily lives of those around you is key for any tech leader looking to make a real impact. It’s all about fostering a sense of community, being open to learning from the ground up, and using that insight to drive innovation and efficiency. Plus, it’s just a more fun and fulfilling way to work: making those connections, understanding challenges first-hand, and seeing the direct impact of your solutions.


    Bio: Karen Kauffman is the Director of Information Technology at Precision Strip Inc.  Karen has 35 years of experience in the IT industry, with 20 years of leadership experience.  Precision Strip leads the industry in metal processing for the automotive, appliance and beverage can industries.

  • 04/25/2024 11:18 AM | Deleted user

    If we were to view technological growth through a wide-angled lens, it becomes evident that advancements are significantly outpacing governance. This rapid progress presents tech leaders with a formidable challenge: How do you safely deploy Generative AI while maintaining data security? As a transformative force, Generative AI rockets innovation forward. Yet, from a data governance perspective, these AI tools also introduce great risks such as unintentionally birthing an amplified insider threat. This dichotomy makes vigilant oversight crucial. This article highlights the dual nature of Generative AI and proposes essential steps to harness its potential while ensuring a safer deployment.

    The Double-Edged Sword of Generative AI: Innovation and Risk

    Generative AI significantly bolsters an organization's technological capabilities, serving as a uniquely clever assistant that streamlines creative processes and enhances data analysis. These advancements don't just open the door to a wide variety of innovative opportunities—they also add a layer of complexity to the management and oversight of these potent productivity enhancers. Without proper checks, the very features you value can quickly morph into formidable threats, leading to security vulnerabilities, misinformation campaigns, and unforeseen ethical dilemmas.

    This nuanced challenge to privacy is compounded by insider threats, which often stem from either human error or malicious intent. Sometimes, malicious attempts to compromise an organization can be inadvertently thwarted due to the perpetrator's sheer ignorance of what critical data they can access. However, in today's reality where data sprawl is rampant and permissions management becomes increasingly complex, Generative AI could unintentionally facilitate privacy breaches. Operating autonomously, it may grant unauthorized access or usage of sensitive information, escalating the risk of data breaches. To go a step further, as AI tools become more sophisticated and adapt within these lax ecosystems, they might inadvertently widen the impact of insider threats, accessing critical data that would otherwise remain undiscovered. Navigating these complexities requires tech leaders to practice stringent supervision of data stores, ensuring AI is employed within strict parameters to uphold privacy and safeguard data integrity.

    Common Pitfalls of Unchecked Generative AI

    The pitfalls of unchecked Generative AI are as varied as they are significant. Take, for instance, the case where a seemingly innocuous mistake by an executive assistant—sharing an executive folder via a collaboration link—became an open vault for anyone within the organization. This misstep led to a critical breach when a departing, disgruntled employee exploited the oversight. The worker harnessed the capabilities of a Generative AI Assistant, Microsoft Copilot, to ingeniously request and obtain sensitive data including executive compensation details, family trust documents, and personally identifiable information (PII) of the organization's leaders.

    Such incidents underline the multifaceted nature of threats posed by unregulated Generative AI. They are not confined to the direct actions of the AI itself, but also encompass how it can be maneuvered to fulfill harmful intentions—especially when combined with human ingenuity and malcontent. This unpredictability of AI-assisted threats adds an intricate layer to the already complex challenge of safeguarding against data breaches, emphasizing the need for rigorous AI governance protocols within any enterprise.

    Strategic Steps to Mitigate Risks

    To effectively manage Generative AI and maintain security integrity, technology leaders should consider implementing the following strategic measures:

    1. Adopt a Least Privilege Model:

    • Restrict access to data strictly on a need-to-know basis.
    • Regularly review and adjust permissions to minimize unnecessary access.

     2. Establish Robust Ethical Guidelines:

    • Draft and enforce clear policies on AI's decision-making boundaries.
    • Create protocols for swift intervention if AI actions deviate from norms.

     3. Deploy AI Monitoring Tools:

    • Implement systems for real-time oversight of AI activities.
    • Ensure traceability of AI actions to their sources for accountability.

     4. Integrate AI Risk Assessments:

    • Incorporate AI threat evaluations into existing security frameworks.
    • Develop proactive strategies to respond to anticipated AI vulnerabilities.

     5. Educate and Train Staff:

    • Conduct regular training sessions on the benefits and risks of Generative AI.
    • Promote a company-wide culture of AI awareness and responsible use.

    These actionable steps provide a comprehensive framework for mitigating the risks associated with Generative AI, promoting ethical use, and fostering an environment of informed vigilance.
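    To make steps 1 and 3 concrete, here is a minimal Python sketch of a least-privilege gate that records every AI-mediated data request for traceability. The role names, resources, and logging setup are invented for illustration; a production system would back this with a real identity provider and SIEM.

```python
import logging
from datetime import datetime, timezone

# Hypothetical role-to-resource map: access is granted strictly on a
# need-to-know basis (least privilege), never by default.
ROLE_PERMISSIONS = {
    "hr_analyst": {"employee_directory"},
    "finance": {"employee_directory", "compensation"},
}

audit_log = logging.getLogger("ai_audit")

def request_data(user_role: str, resource: str) -> bool:
    """Grant access only if the role explicitly allows the resource."""
    allowed = resource in ROLE_PERMISSIONS.get(user_role, set())
    # Every AI-mediated request is logged, so actions trace back to a source.
    audit_log.info(
        "%s role=%s resource=%s allowed=%s",
        datetime.now(timezone.utc).isoformat(), user_role, resource, allowed,
    )
    return allowed

print(request_data("hr_analyst", "compensation"))  # False: not in this role's set
```

In the folder-sharing incident described above, a check like this would have denied the Copilot-assisted request for compensation data because the requester's role never held that permission, regardless of how the folder was shared.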

    Charting a Safer Course: The Imperative of AI Governance

    We truly do stand at a crossroads of innovation and responsibility. It’s clear that unchecked use of Generative AI tools presents formidable risks alongside their tremendous upside for productivity. The incidents and challenges we've discussed underscore how essential it is to take a proactive risk management approach when deploying Generative AI in your ecosystem.

    To ensure the ethical deployment of AI, it is imperative to establish and uphold stringent ethical guidelines. These guidelines act as the compass that guides AI behavior, ensuring that it aligns with your values and ethical standards. Additionally, investing in comprehensive education and training programs is not just about risk avoidance but also about empowering your team to leverage AI responsibly and effectively.

    By reinforcing these two pillars—ethical guidelines and continuous learning—you cultivate a knowledgeable and principled workforce. Strengthen your AI oversight and educate your teams on how to safely utilize AI tools. By taking these decisive steps, you will not only protect your organization but also position it to thrive, deploying AI with precision and security.


    Bio: Ramone Kenney brings over 13 years of expertise in providing technology solutions to complex challenges. As Manager of Enterprise Accounts, specializing in Cyber Security for Varonis, he is instrumental in overseeing the deployment and implementation of robust data security measures. Ramone is committed to leading the charge in risk reduction initiatives, ensuring that his clients are safeguarded against ever-evolving digital threats.

  • 03/27/2024 10:48 AM | Deleted user

    Embracing AI in Leadership

    In November 2023, I presented to and spoke with a group of technologists on the topic of “Embracing AI in Leadership.” The mission was to open people’s eyes to the possibilities of enhancing their own growth potential in their leadership journey by utilizing Generative AI. This included real case studies, several Gen AI tools, and various prompting techniques. While there were a few skeptics at the beginning of the presentation, by the end they were converted into believers. Task completed? Well, I think this journey is and will be a work in progress for the foreseeable future.

    My decision to delve into this topic was driven by multiple inspirations. First among them was witnessing the widening divide between the technologically empowered and the underserved. My journey from blue-collar sectors to the forefront of the tech industry has confirmed and validated the idea that there are boundless opportunities that technology can offer. Yet, it has also highlighted a stark reality: as technology progresses, the chasm deepens between those who continuously evolve with technology and those who remain disconnected. This disparity, especially with the advent of easily accessible Generative AI, is set to expand even more rapidly, distinguishing between the adopters of AI and those left behind.

    Setting AI aside for the moment, Scott Klososky points out that we have an existing problem. Even amongst well-established organizations, there are digital skills gaps that incur massive hidden costs. I have witnessed this myself at multiple organizations over the past several years. For instance, many individuals remain reliant on their mapped network drives for accessing and managing shared content. Mapped network drives have been around for a very long time. At the risk of opening myself up to rebuke, I’ll go ahead and say it: this is old-school technology, pre-cloud, pre-OneDrive, and pre-modern-tech.

    On one hand, some IT departments have invested heavily to ensure access to their mapped drives and shared content is seamless. On the other hand, many users experience frustration with the less-than-optimal method of working with their content, especially when on the move. What makes this worse is these same organizations have also invested in Microsoft 365 (M365), a comprehensive digital workplace. Yet 80% or more of users are still locked in their old-school processes that revolve around their mapped drives (these are my unofficial stats).

    This scenario underscores a missed opportunity to leverage existing resources more effectively and highlights the pressing need to address the digital skills gap within the organization. As Scott points out, the organization is responsible for closing the digital skills gap, or it will suffer the resulting costs in lost efficiency and effectiveness.

    Get Your Digital House in Order

    Switching back to the AI topic: if your organization already has a detrimental digital skills gap, you have some work to do as a technology leader. Embracing AI in Leadership is based on the idea that you can use generative AI to improve your leadership capabilities, including strategic thinking, future-scenario simulations, and enhanced research, and that it can potentially help you devise a strategy and plan for closing the digital skills gap in your organization. I spent a few minutes working with ChatGPT to generate the following high-level strategy as a good starting point to consider.

    | Strategy Component | Action Items | Expected Outcomes |
    | --- | --- | --- |
    | Baseline Digital Skills Assessment | Assess to identify skills gaps. Benchmark against industry. | Clear understanding of skills gaps. Identified improvement areas. |
    | Customized Learning Pathways | Develop role-specific pathways. Utilize AI for personalized learning. | Role-relevant skills improvement. Higher engagement. |
    | Leverage Existing Technologies | Train with Microsoft 365. Workshops for new tech adoption. | Improved tool utilization. Reduced outdated tech reliance. |
    | Promote a Culture of Continuous Learning | Reward learning achievements. Foster knowledge sharing and mentoring. | Learning as a core value. Supportive learning environment. |
    | Generative AI in Leadership Development | Integrate AI for planning and simulations. AI training for leaders. | Enhanced leadership skills. Better planning capabilities. |
    | Continuous Feedback and Adaptation | Feedback loops for learning effectiveness. Update materials based on feedback. | Agile adaptation to tech changes. Skills aligned with goals. |

    Beyond the Skills Gap: Paving the Way for AI and Digital Innovation

    This strategy lays the foundation for AI adoption and other technological advancements. While addressing the digital skills gap is crucial, it’s just one aspect. Organizations are embarking on extensive digital transformation: adopting advanced ERP systems, shifting to cloud-based development, enhancing cybersecurity, and establishing robust disaster recovery plans. Amidst these transformations, businesses strive to excel, balancing multiple initiatives seamlessly in their dynamic operations.

    As a technology leader, prioritize delivering high-quality digital transformations, using AI judiciously to augment your leadership and ensure initiatives remain focused and effective, without diluting quality or mission clarity. At some point, your organization might consider tackling AI head-on: for example, evaluating viable use cases, deploying your own Large Language Model (LLM) internally, training or fine-tuning your models using your own proprietary data, and then figuring out how to productionize AI solutions. As you prepare for this new initiative, find the right partner and hire or train the right leaders who can devote 100% of their effort to AI. You will need leadership, experience, and a team to tackle this new space.

    In Conclusion

    As we explore “Embracing AI in Leadership” and tackle the digital skills gap alongside our digital transformation efforts, the journey is both challenging and filled with opportunities. Fostering a culture of continuous learning and adaptability is key to not just bridging the digital divide but thriving in the era of rapid technological advancement. Our ongoing endeavor is to leverage technology to enhance our leadership, improve operations, and drive organizational growth in a world where technology continuously reshapes our landscape.

    Bio: Kalen Howell has a master's degree in computer science from Franklin University and an MBA from the University of Dayton and has worked in the software development industry for over 20 years. Today, as CIO, Kalen leads a technology team, IT OPS, and Software Development, with huge digital initiatives rolling out to organizations across the US, Canada, and Mexico.

  • 03/27/2024 10:44 AM | Deleted user

    My youngest son has no fear of technology and, until recently, no programming experience. That dramatically changed when his logistics professor asked his business school students to build an interactive digital map of the world depicting the sourcing, manufacturing, shipping, and receiving patterns for a fictitious, complex supply chain case study.

    My son, in a few days, used an Excel data file and OpenAI’s GPT-4 to create and edit code in RStudio and build the interactive model. This exemplifies the democratization of skills previously accessible only to a subset of the population.

    As a result, we are going to see an explosion of innovation as creative people around the globe tap into this power.

    Applying AI to Solve Healthcare Problems

    In healthcare, we've leveraged algorithm-based machine learning for years. Our primary applications include imaging studies that quickly identify tissue abnormalities, genetic assessments that gauge future cancer risk, and predictive models that catch infections early so they can be treated before becoming life-threatening.

    Generative AI is going to positively impact healthcare, and Kettering Health will see initial benefits in three areas: 

    • Provider workload: Chart summaries can be automatically prepared to share with the patient or referring provider. Inbox assistants help the clinician quickly reply to patient questions.
    • Rare disease management: Analytics assistants will help clinicians find other patients with rare symptoms like the ones they are treating and help the clinician connect with that care team so a treatment plan can be developed.
    • Revenue cycle efficiency: Existing automation opportunities will be enhanced to increase the efficiency of burdensome diagnostic coding and billing processes.

    Most healthcare organizations don't have the deep pockets needed to invest in this technology to be on the cutting edge. But the good news is that Epic (our electronic healthcare record systems partner) and Microsoft are making huge investments in generative AI.

    Because of that, we can partner with our providers to prototype features in the three areas I listed to determine which are the most valuable, integrate them into their workflows, and then continue to learn as more practical applications are developed.

    Our first serious venture into generative AI is to use it to assist our physicians in selecting the appropriate diagnosis codes while charting, saving time and increasing accuracy. We are piloting a solution in one of our hospitals, with plans to scale throughout our system’s acute locations.

    It’s More Than Hype

    This technology is exciting, and no one is completely sure of its future benefits. We’ve seen careless accelerated approaches to adopting AI lead many organizations, across all industries, into the troubled waters of IP law, copyright infringement, plagiarism, and even moral calamity.

    However, as my son’s experience illustrates, the potential value is real.

    I look forward to when more of us adopt these skills and move beyond generative AI applications that save time or reduce the burden of repetitive tasks to truly life-enhancing innovations for our community.

    Bio: Eric Crouch serves as the Chief Information Officer (CIO) and Vice President for Information Services at Kettering Health. He oversees the strategic planning, implementation, and management of information technology and digital transformation initiatives. The people at Kettering Health serve in medical centers and outpatient locations throughout western Ohio and are dedicated to elevating the health, healing, and hope of the community.

  • 03/27/2024 10:36 AM | Deleted user

    Quantum computing is a massive leap forward in computational capabilities, potentially revolutionizing whole industries and solving previously intractable problems. Quantum computing relies on highly specialized technology (hardware and algorithms) that takes advantage of quantum mechanics to process information in ways that would have been considered science fiction not long ago. Quantum computers can solve extraordinarily complex problems that even supercomputers can’t solve in a reasonable amount of time, like before the sun burns out. Let’s take a deeper look at quantum computing.

    What is quantum computing?

    Quantum computers use quantum bits, or qubits, as basic informational units. Unlike traditional bits which can represent either 0 or 1, a qubit can exist in a state of 0, 1, or both at the same time thanks to the quantum phenomenon known as superposition. This allows quantum computers to explore multiple possibilities simultaneously, and in conjunction with other quantum phenomena such as entanglement and interference, solve highly complex problems much faster than traditional computers.
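    As a toy illustration (a small classical simulation in Python, not real quantum hardware), superposition can be computed directly from a qubit's two amplitudes. Applying a Hadamard gate to a qubit in the definite state |0⟩ yields equal measurement probabilities for 0 and 1:

```python
import math

# A single-qubit state is a pair of amplitudes (a0, a1)
# with a0^2 + a1^2 = 1 (real amplitudes suffice for this sketch).

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a0, a1)."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

ket0 = (1.0, 0.0)          # the definite state |0>
psi = hadamard(ket0)       # an equal superposition of |0> and |1>

# Born rule: measurement probabilities are the squared amplitudes.
probs = tuple(round(a * a, 3) for a in psi)
print(probs)   # (0.5, 0.5): equal chance of measuring 0 or 1
```

With n qubits the state holds 2^n amplitudes at once, which is the sense in which a quantum computer "explores multiple possibilities simultaneously."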

    The quantum theory of physics was born in 1900 when physicist Max Planck published his study on the effect of radiation on a “blackbody” substance. As a subset of quantum physics, quantum computing is a relatively new field that began to take off in the early 1990s. Astounding progress has been made in the decades since; major milestones include:

    • 1992: David Deutsch and Richard Jozsa demonstrate the first quantum algorithm that can outperform any traditional algorithm on a specific problem.
    • 1994: Peter Shor develops a quantum algorithm for factoring integers with the potential to crack public-key encryption schemes such as RSA and ECC.
    • 1996: Lov Grover invents a quantum search algorithm that sports a quadratic performance improvement over traditional algorithms.
    • 2001: IBM builds the first quantum computer to execute Shor's algorithm.
    • 2007: D-Wave Systems demonstrates a prototype of the first commercial quantum computer.
    • 2019: Google claims quantum supremacy by performing a random circuit sampling in 200 seconds versus 10,000 years on a state-of-the-art supercomputer.
    • 2020: IBM announces quantum advantage by performing a financial portfolio optimization task in 2 minutes versus 6 hours on a state-of-the-art supercomputer.
    • 2020: China achieves quantum supremacy by performing a Gaussian boson sampling task in 200 seconds versus 2.5 billion years on a state-of-the-art supercomputer.

    • 2023: IBM announces plans to build a 100,000-qubit machine that will work alongside traditional supercomputers to achieve breakthroughs in drug discovery and other advanced applications.

    Quantum computing technology continues to advance rapidly. Industry leaders such as IBM, Google, and D-Wave, as well as academic institutions and startups, continue to push the boundaries of what's possible. Currently, most work in the field is focused on reducing error rates, developing error-correcting algorithms, increasing the number of qubits, and improving qubit coherence times (the length of time a qubit can maintain its quantum state).

    Common Benefits of Quantum Computing

    The advent of quantum computing brings a host of benefits including:

    • Speed: Quantum computers can solve certain problems exponentially faster than traditional computers. By some estimates, Shor's algorithm running on a sufficiently large, fault-tolerant quantum computer could factor a 2,048-bit number in hours, versus billions of years on a traditional supercomputer. It’s worth noting that this has raised strong concerns about the future of encryption.
    • Efficiency: Quantum computers can solve certain complex problems with fewer resources than traditional computers.
    • Power: Quantum computers can manage more complex problems, even problems that were previously unsolvable.
    • Security: Quantum computing introduces new paradigms in encryption.
    • Innovation: Quantum computing enables breakthroughs by handling extraordinarily complex simulations and calculations that are impractical if not impossible for traditional computers.
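    The quadratic speedup Grover's search algorithm offers can be sketched with back-of-the-envelope query counts. This assumes an idealized, error-free quantum computer; the numbers illustrate scaling, not current hardware:

```python
import math

# Searching an unstructured list of N items takes about N/2 classical
# queries on average, but only about (pi/4) * sqrt(N) Grover iterations.

def query_counts(n_bits: int) -> tuple[int, int]:
    """Return (average classical queries, Grover iterations) for N = 2**n_bits."""
    N = 2 ** n_bits
    classical = N // 2
    grover = math.ceil((math.pi / 4) * math.sqrt(N))
    return classical, grover

for n_bits in (20, 30, 40):
    classical, grover = query_counts(n_bits)
    print(f"N = 2^{n_bits}: ~{classical:,} classical queries vs ~{grover:,} Grover iterations")
```

For a million-item search (N = 2^20), that is roughly 524,288 classical queries against about 805 Grover iterations, and the gap widens as N grows.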

    Common Challenges of Quantum Computing

    Quantum computing is not without challenges, including:

    • Hardware: Quantum computers are complex, expensive and difficult to build and operate. They currently require extreme cooling and specialized equipment to maintain the quantum states of qubits, making them impractical for widespread use.
    • Software: Quantum computing requires new programming languages, algorithms, and tools.
    • Threats: Quantum computing poses a significant threat to current cryptographic standards putting vast amounts of sensitive data at risk.
    • Technical Challenges: Quantum computing is an immature and rapidly advancing field. Error rates and qubit coherence must be overcome to fully realize quantum computing’s benefits.
    • Ethics: Quantum computing poses profound social and ethical implications. It creates the potential for new forms of power and influence, as well as new forms of harm, inequality, and conflict. For additional insights, check out the Deloitte article “Quantum computing may create ethical risks for businesses. It’s time to prepare.”

    How Quantum Computing Is Currently Used

    Despite challenges and concerns, quantum computing has the potential to revolutionize industries including artificial intelligence, physics, medicine, chemistry, logistics, finance, cryptography, and more. Current applications for quantum computing include:
    • Advanced simulations: Quantum computers could simulate the behavior of molecules, atoms, and subatomic particles leading to breakthroughs in the design of new materials, drugs, and energy sources.
    • Optimization: Quantum computers could tackle complex optimization problems that require finding the best solution from a vast number of possibilities. Examples include scheduling, routing, and planning.
    • Accelerating machine learning: Quantum computers could enhance the performance and capabilities of machine learning models through faster training, better accuracy, and less complexity.
    • Cracking encryption: Quantum algorithms like Shor’s pose a significant threat to data secured by current encryption schemes. Cybersecurity experts have speculated that adversarial nation-state actors (advanced persistent threats, or APTs) are collecting as much encrypted data as possible now, knowing that quantum computing may soon make it feasible to crack current encryption schemes and unlock vast troves of stolen data.
    • Improving encryption: Quantum cryptography promises strong encryption that will be virtually unbreakable. NIST has announced four post-quantum encryption algorithms designed to withstand quantum attacks.

    Conclusion: Facing Our Quantum Future

    Quantum computing is a new paradigm that transcends the limits of traditional computing by harnessing the power of quantum physics to perform calculations that are impractical if not impossible for traditional binary-based computers. While still in its developmental stages, quantum computing shows the potential to radically transform our world.

    For technology professionals, quantum computing presents unprecedented opportunities and challenges. Understanding its principles, applications, benefits, and risks will allow us to play a pivotal role in shaping this technology, addressing ethical concerns raised by it, and developing secure, equitable access to quantum technologies.

    The journey into the quantum future is just beginning, and its impact on society, science and the economy will be profound. It’s an exciting time to be alive and to be in tech as we now have the opportunity to shape this future and ensure the power of quantum computing achieves its most beneficial potential.

    Bio: Dave Hatter – CISSP, CISA, CISM, CCSP, CSSLP, PMP, ITIL, is a cybersecurity consultant at Intrust IT. Dave has more than 30 years’ experience in technology as a software engineer and cybersecurity consultant and has served as an adjunct professor at Cincinnati State for nearly 20 years. He is a privacy advocate and an Advisory Board member of the Plunk Foundation. Follow Dave on X (@DaveHatter) for timely and helpful technology news and tips.

