Exclusive! The Power of Datafication of Everything for GCC 2.0 – OpenGov Asia


Singapore’s government announced a five-year plan to migrate all its information technology (IT) systems from on-premises infrastructure to a commercial cloud platform to speed up service delivery and improve services for citizens and businesses. According to GovTech, the government to date has close to 600 systems on the cloud and is on track to have 70% of eligible systems on the cloud by 2023.

The Government on Commercial Cloud (GCC) initiative was launched to standardise the onboarding experience and administrative tasks of government agencies on the cloud, such as workload administration, account and billing management, secure access, and compliance.

After the launch of GCC 1.0 as a “wrapper” platform, government agencies are looking forward to the iterative enhancements of the GCC 2.0 platform vision, which includes:

  • Customer centricity
  • Automation focus
  • Improved observability and auditability
  • Compliance and security by design
  • API first
  • Native solutions over builds

GCC 2.0 will include a single identity, endpoint posture checks, access control, native workload administration, controls, and compliance. The upgrade puts a sharp focus on areas such as migration, protection and security. Modernising data backup and recovery options is also critical to protecting data during migration.

Understandably, most ministries have not yet implemented GCC 2.0, but it is critical to prepare for these significant upgrades and to have backup and recovery in place so that services provided to citizens are not disrupted.

This practical concept has proven effective and is expected to deliver as all ministries prepare for GCC 2.0. It is critical to plan the migration strategically to ensure that data backup is complete and automated.

To guarantee that data is always recoverable and available in the event of outages, attacks, loss or theft, ministries must protect all workloads with backups, supplemented with snapshots and replication where appropriate. This practice strengthens data governance, thereby increasing citizen and investor trust.

Moving forward, the emphasis will be on security controls based on identity access management and policy-as-code, such as real-time configuration checks. This will allow GCC 2.0 to rely less on cloud management portals for better control and automation of service requests.
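The policy-as-code idea mentioned above can be made concrete: rules are declared as data and evaluated automatically against a resource's live configuration, rather than checked by hand in a management portal. A minimal illustrative sketch follows; the rule names and configuration fields are hypothetical examples, not actual GCC 2.0 policies.

```python
# Policy-as-code sketch: each policy is a named predicate over a
# resource's configuration, so compliance checks can run continuously.
POLICIES = [
    ("encryption-at-rest", lambda cfg: cfg.get("encrypted", False)),
    ("no-public-access",   lambda cfg: not cfg.get("public", True)),
    ("backup-enabled",     lambda cfg: cfg.get("backup_retention_days", 0) >= 7),
]

def check_compliance(config: dict) -> list[str]:
    """Return the names of all policies the given configuration violates."""
    return [name for name, rule in POLICIES if not rule(config)]

violations = check_compliance({"encrypted": True, "public": False,
                               "backup_retention_days": 3})
print(violations)  # only the backup-retention policy is violated
```

Because the rules live in code, the same checks can run at deployment time and in real time, which is the shift away from portal-driven control the GCC 2.0 vision describes.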

The Public Sector Day, held on 14 April 2022 at the Singapore Marriott Tang Plaza Hotel, aimed to impart knowledge on the steps the Singapore public sector needs to focus on for data migration, automation, protection and recovery while upholding security and compliance in the transition towards the GCC 2.0 vision.

Embracing technology to elevate citizen experiences

Mohit Sagar: Adapting strategy to shifting culture

Kicking off the session, Mohit Sagar, Group Managing Director & Editor-in-Chief, OpenGov Asia, acknowledges that the pandemic brought significant changes in culture and perspective.

“In the new normal, habits have changed,” Mohit opines. “Data is more important than ever. Data needs to be available, but it has to be safe.”

Yet organisations need to take a serious look at how cloud is implemented. In Singapore, the government has worked to ensure that organisations have elasticity, security, and interoperability.

He believes that organisations should focus on prediction while keeping data safe and accessible. Beyond that, he asserts that citizen experience is paramount – no one will stand in line for government services.

The New Zealand government advised people not to try to build everything in-house. Instead, organisations should use existing external technology to help them reach their destinations faster.

Closing his address, Mohit strongly recommends organisations look for specialists to partner with. “Let the experts do what they do best,” Mohit urges. “It not only allows the best systems and infrastructure to be put in place but also frees up the organisational workforce to focus on driving growth.”

Data strategies to power a digital government

Raymond Goh: Data-defined strategy that drives digital government

Raymond Goh, Senior Director, Systems Engineering, Asia & Japan, Veeam, spoke next on the nuts and bolts of devising a data strategy in the public sector.

“How are we embracing technology? Are you an optimist or a pessimist?” Raymond asks.

According to Raymond, there will always be pressure regardless of whether one is an optimist or pessimist. Data is exploding and organisations are running out of capacity to store data. There are several implications of that:

  1. De-duplication and compression to redress the capacity gap
  2. Feasibility of media technology like object storage architecture
  3. Intelligent data management that increases efficiency and utilisation
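De-duplication, the first of these remedies, can be sketched in a few lines: data is split into chunks and only one copy of each unique chunk is stored, keyed by its content hash. This is an illustrative toy (fixed-size chunks, in-memory store), not any vendor's implementation.

```python
import hashlib

def dedup_chunks(data: bytes, chunk_size: int = 4) -> dict[str, bytes]:
    """Split data into fixed-size chunks and keep one copy per unique
    chunk, keyed by content hash - the essence of de-duplication."""
    store: dict[str, bytes] = {}
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        store[hashlib.sha256(chunk).hexdigest()] = chunk
    return store

data = b"AAAABBBBAAAABBBB"   # 16 bytes with heavy repetition
store = dedup_chunks(data)
print(len(store))  # only 2 unique 4-byte chunks are stored
```

Production systems use variable-size, content-defined chunking and combine de-duplication with compression, but the capacity saving comes from the same principle: repeated content is stored once.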

While data management is needed, Raymond acknowledges the challenges that organisations face, including manually classifying data from different inputs and applying it to a compliance, disaster recovery, security, or archive strategy. Yet there is hope: cognitive computing can sort, tag, place and automate data movement.

Sharing some of the use cases of data management systems, Raymond highlights the use case in the business continuity plan. Some of the key benefits are as follows:

  • Backup data classification, from tier-based archiving to cloud and/or tape, to cater for zettabyte-scale data growth
  • Disaster recovery data classification, together with infrastructure resources, to combat cyberthreats or data loss
  • Using criticality and gap assessments to ensure governance and compliance

Raymond also emphasises that it is a gradual process towards a hybrid cloud model and not “a big bang” approach. He shared how Veeam helps organisations with digital transformation.

For Veeam, there are five stages of intelligent data management:

  1. Backup

Protect all workloads using backups, complemented by snapshots and replication where appropriate, to ensure they are always recoverable and available in the event of outages, attacks, loss or theft.

  2. Cloud mobility

Provides easy portability and fast recovery of any on-premises or cloud-based workloads to AWS to maintain business continuity and availability across hybrid cloud environments.

  3. Visibility

View the full breadth of your data, accompanied by the infrastructure that it passes through and resides on so that you can pivot from reactive to proactive management for better business decisions.

  4. Orchestration

Optimise data utilisation across multi-cloud environments with workflows that ensure consistent execution of otherwise manual and complex backup, recovery, and data management tasks.

  5. Automation

Data becomes self-managing by learning to protect itself with appropriate SLAs (service level agreements), methods, and locations to meet business objectives or comply with broader IT initiatives.

From his experience, he concludes that Veeam offers agencies a better data management system, allowing government agencies to provide better and faster services to citizens. Overall, it gives agencies the ability to protect, manage and unleash data.

Strategising and implementing data governance

Andrew Bell: Operationalising data governance

Andrew Bell, Partner Segment Lead – Storage, APJ, Amazon Web Services, elaborated on how organisations can implement data governance.

Opening his presentation, Andrew shares that the prevailing challenge lies in the explosion of data. There is a need to strike a balance between access and control. However, how data is accessed – as individuals, employees or citizens – is changing. Concurring with Mohit’s point that people no longer want to stand in line at a bank or in a government department, Andrew feels that this trend will not slow down.

“Data is a business asset, therefore more people are working with data than ever before,” Andrew claims. According to him, organisations need to:

  • accelerate their innovation initiatives through the use of data analytics, IoT, ML, and others
  • securely share and access data assets – access only for the users who need it and only when they need it
  • ensure data security and compliance – policies and technical solutions to ensure data privacy
  • monitor data trends and data behaviour to drive the correct data-driven business decisions
  • develop data skills and approaches
  • reduce their data costs
  • meet business demands through higher data quality

He believes that while organisations can buy different technologies to do this, all of it comes down to governance – to the policies and frameworks that are put in place around data: protecting it, accessing it and storing it. Only when people have clarity on where data comes from and have the policies in place can they trust that data. Once people trust the data, they can modernise it and get value out of it.


Data governance is foundational to everything that organisations are trying to achieve. Beyond monetisation or transformation, it provides that foundation to do a lot more. IT is not just a cost centre, IT will be driving the business forward and making sure that the data is available.

According to Andrew, challenges still exist, as agencies will need to modernise to align with the GDA and Singapore’s Data Governance and Protection Framework.

Agencies need to ensure they only collect data of value that is not already in the Single Source of Truth (SSOT) – and be agile as SSOTs evolve and become available.

Data quality is critical to any decision-making – systems need to be in place to ensure its accuracy and validity.

Data users and data owners need to be accountable. Data governance is everybody’s job and needs to be embedded into the culture of the agency.

Automation is critical.

Andrew shared the implementation process. Some ways AWS can help include assessing governance rules, organisation and operating model, compliance, business alignment, technology, and recommendations.

Andrew concludes that data governance and protection are highly critical, and the only way to achieve them safely is through automation. Without it, organisations are hamstrung and will not be able to innovate.

Navigating transitions to cloud

Jon Lau: Taking on the challenges of data migration to the cloud

Jon Lau, Director – Scientific IT Wing, ITSS and Chief Information Security Officer, A*STAR talked about the challenge of data migration to the cloud.

Jon shared that while A*STAR is a government agency, it comprises many researchers. A*STAR has government systems but also provides IT systems for researchers, which must comply with its own internal policies as well as government policies.

In 2018, Singapore laid out a five-year plan to migrate 70% of its less sensitive government IT systems from on-premises infrastructure to the commercial cloud. A statement from GovTech said that close to 600 systems had been migrated to date.

Jon shared that they started moving quite a few of their corporate systems to the cloud with that new mandate. He admitted that it was more of a lift-and-shift. At the same time, they discovered that they need to upskill their staff. It was not just about the applications team, it also involved the network and security team. He believes that when it comes to the cloud, it is about shared responsibility.

He added that the data classification exercise is one of the most important things. As they moved to cloud, they needed the data owners to commit and understand how to classify the data.
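Automation can enforce that commitment by the data owners: one common pattern is to let the owner-assigned classification tag drive where data may live, refusing to migrate anything untagged rather than guessing. A minimal sketch follows; the tier names and storage targets are hypothetical, not A*STAR's or GCC's actual scheme.

```python
# Hypothetical classification-driven placement: the data owner's tag
# determines the permitted storage target for migration.
ROUTING = {
    "official-open": "commercial-cloud",
    "restricted":    "commercial-cloud-encrypted",
    "confidential":  "on-premises",
}

def placement(classification: str) -> str:
    """Map an owner-assigned classification to a storage target,
    rejecting untagged data instead of silently migrating it."""
    try:
        return ROUTING[classification]
    except KeyError:
        raise ValueError(f"unclassified data may not be migrated: {classification!r}")

print(placement("restricted"))  # -> commercial-cloud-encrypted
```

The design choice worth noting is the hard failure on unknown tags: a migration pipeline that defaults untagged data to any tier undermines the classification exercise.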

Accordingly, he shared two use cases:

  1. Moving research data from on-prem to the cloud.

That process involves understanding data classification, understanding the governance around data, and protecting it securely.

  2. Digital twins

Digital twins are replicas of the real-life environment – data is taken from real life and transferred into the digital model. With digital twins, organisations can do simulations for analysis.

In conclusion, Jon advised delegates to understand what their businesses need, look at what their data is and why data protection is required. Organisations also need to leverage the cloud and overcome the data migration challenges to truly accelerate their growth.

Power Talk / Interactive Discussion

Following the presentations, Mohit moderated an interactive discussion featuring panellists Andrew Bell, Partner Segment Lead – Storage, APJ, Amazon Web Services; Kevin Ng, Director, Government Digital Services (Central), GovTech; and Raymond Goh, Senior Director, Systems Engineering, Asia & Japan, Veeam.

Kevin Ng, Director Government Digital Services (Central), GovTech

The first poll asked delegates what percentage of their servers/workloads had at least one unexpected outage (even an unplanned reboot) within the last 12 months. Most (60%) of the delegates indicated that it was less than 5%. About 35% of the delegates indicated that 5%–20% of their servers or workloads had such issues, while about 5% had more than 50% of their servers or workloads affected by at least one unexpected outage in the past year.

Andrew commented that the risk of downtime is something that everyone is looking into.

Raymond added that there is a concept called the availability gap – the gap between how quickly users expect services to be recovered and how quickly an organisation can actually get the service up and running. He distinguishes this from the protection gap, which concerns how much data loss an organisation can tolerate versus how often its data is actually protected.

The second poll asked delegates how confident they are about recovering within SLA from a disaster, disruption, ransomware, and corruption. More than half (57%) are fairly confident, while the rest of the delegates were either very confident (19%), not confident (14%) or not sure (10%).

In response to the poll, Kevin shared that organisations must understand what their service level agreement (SLA) is and whether that SLA applies to a traditional application.

Andrew observes that the options require different strategies:

  • Disruption is about leveraging the best services available to organisations
  • Disaster is a widespread term, and it concerns recovery
  • Ransomware is about stopping others from getting in

He believes that it is a safe stance to assume that organisations will get hit and to do their best to ensure that the strategies are in place.

On the note of attacks, Raymond remarks that traditional attacks are still going through – email scamming etc.

Mohit added that scams have been on the rise; last year, Singaporeans lost $660 million to scams.

Reflecting that threats will always be there, Raymond believes that the only way anyone can feel 100% safe is to interact through a bubble – which is not feasible. GCC 2.0 aims to give users the confidence to use cloud more natively. For Raymond, the old method of protection is like a country’s border controls, such as customs. Since the pandemic, however, the world has moved towards individualised controls – checks at a particular shopping mall or hawker centre. GCC 2.0 follows this approach.

On how confident they are that their organisation’s data/workloads can move securely across platforms/clouds, over a third (37%) felt fairly confident, while the remaining delegates were not confident (27%), not sure (27%) or very confident (9%).

Raymond believes that the technology has evolved, whether from the security standpoint or at the encryption level – for data in transit or at rest. He opines that it is well-developed and mature enough to build a confident path that organisations can take.

For Kevin, the key is to have your identity locked down and secured.

Andrew added that while organisations are concerned about their capability in moving the application or data or workload to another platform, it is also important to consider all the tools that support it – data protection, policy management, and firewall security.

On the topic of hybrid multi-cloud, Kevin advocates relying on vendors because they are proficient in what they are doing. Mohit added that the main challenge that organisations face is the skill and the expertise.

Raymond feels that in considering cloud adoption, organisations need to decide whether to keep some workloads on-premises and others on the cloud. To do so requires a blueprint. For him, it boils down to capabilities – the technology available along with training to upgrade the skillsets of employees.

He also believes that part of the work is in building confidence towards embracing hybrid or multi-cloud technology – building the culture of getting into cloud, such that people find it safe to get their data into the multi-cloud world.

With regard to key concerns in delegates’ move to the cloud, most (64%) were apprehensive about security and governance. The remaining delegates were anxious about the need to re-skill talent (27%) or operational costs (9%).

A delegate expressed that talent is always an issue but security and governance are critical while another said that the transition to cloud is difficult.

Mohit believes that organisations need to rebuild their infrastructure – they cannot lift and shift because how it was running on-prem is completely different. Going cloud-native requires a fundamental change.

Andrew remarks that a lot of the studies show that cloud is more secure, but that people need to build up their confidence level, which also requires talent.

On the note about lift-and-shift, a delegate suggested first understanding one’s organisation’s business before thinking about technology. Only by understanding the organisation’s needs can one decide whether to adopt a cloud-native solution.

On his thoughts on how to embark on the journey of cloud adoption, Kevin shared that the premise of GCC 2.0 is to reduce all controls so that those who are familiar and capable can move faster. When it comes to policy, there are two camps, he opines.

One camp claims that policy is changing too fast, while the other says that it is not changing fast enough. On that note, Raymond agreed that the move to cloud involves a balancing act.

Raymond pointed out that there is often a misconception about the move to cloud as lowering costs. For him, it is about operational efficiency and not necessarily about cost.

Mohit asked Kevin to share advice on how organisations can begin their transition to cloud. In response, he urged delegates to decide where they want their data to reside. Is there enough classification to put everything in the cloud? Or should organisations only store data-at-rest on-prem and tap cloud for data processing?

The final poll asked delegates how important it is for their workloads/data to be portable and able to work heterogeneously between on-premises and cloud environments. An overwhelming majority (76%) found it very important, while the remaining delegates found it somewhat important (12%), not important (6%) or were not sure (6%).

For Andrew, whether it is data classification or modernisation or migrations, it is important to understand what the business driver is and to have a long-term goal. While he believes it is absolutely critical to be portable, he suggests also taking a long-term view, and starting small. Lay the foundations and set the strategy, moving little by little in the right direction.

Raymond found it extremely important as well. He believes that it is vital to decouple data so that they are portable whether on-premises or on-cloud.

Conclusion

In conclusion, Mohit thanked everyone for their participation and honest sharing. Automating data migration and protection is a process that will benefit organisations in the face of a tremendously fast-paced world, he asserts. The transition is a journey, and organisations must recognise that partnerships are the linchpin to success.

He encouraged delegates to keep the conversation going and to reach out to the experts if they have any queries.


