No Clocks

2680 bookmarks
Getting Started with Julia for Actuaries
An overview of the basic tooling and packages available for the Julia programming language, with a focus on why this is of interest to the actuarial profession. It covers some of the built-in language features as well as parts of the Julia package ecosystem that make Julia ideal for modeling and data analysis. This is the second installment in a series of articles aimed at actuaries. The first installment, Julia for Actuaries, was published in the October 2020 edition of Actuarial Technology Today, and covered the base language itself, its high-level, math-friendly syntax, and why the actuary of the future should consider its adoption.
·soa.org·
Actuarial Process Optimization—A Case for Using Modern Technology in Actuarial Domain
Actuarial Process Optimization is a framework for using technology to support the actuary of the future. In this article, we discuss the capabilities of new technologies and explore examples where they can be used to aid with providing strategic business direction, optimizing skillsets and use of technology, and maintaining governance, control, and risk frameworks.
Innovation in technology has disrupted nearly every industry, creating mounting internal and external pressures on organizations to accelerate the adoption of their digital agendas. The COVID-19 pandemic further brought to light the need for technology to be adaptable, powerful, and scalable. Insurance companies have been trying to keep up with the pace of innovation, primarily focusing on adopting modern technology on the consumer-experience front.
Many have modernized the front-office processes that support application, underwriting, and claim handling. The significant benefits achieved through process optimization and the use of modern technology have been the topic of discussion at many insurance conferences and in many publications.
As the insurance industry leaps into the future, actuaries must not only proactively refine their roles and responsibilities within insurance companies, but also seek opportunities for improvements and optimization in their day-to-day work. This article will focus on how technology and actuarial process optimization will support the role of the future actuary as a leader, risk manager, and technologist.
Opportunities for APO
Actuaries are valuable and strategically important resources to insurance companies. We are trained both on the job and through an intensive education and exam curriculum to study and own insurance risk. As highly capable professionals, actuaries are often self-reliant, and are interested in owning all the technology tools, data, and processes that support their daily jobs. However, this hands-on approach can lead to performing many tasks that do not require actuarial expertise. By occupying ourselves with high-effort but low-value tasks, we often neglect the work that is truly valuable to the future of our organization. Advancements in technology and the desire for lean operations have led many insurers to evaluate their strategic direction and the role of the actuary of the future, shifting the focus onto the highest-value tasks.
Typical pain points include: unnecessarily complex and error-prone ETL, production, and reporting workflows; time and resources wasted on resolving errors and tracing back complex process steps; production and process errors that can result in misstatements and delays in reporting; multiple unvalidated spreadsheets with overlapping functionality; multiple sources of information but no single "source of truth"; and wasted storage and processing time.
·soa.org·
WSGI Servers
A Web Server Gateway Interface (WSGI) server runs Python code to create a web application. Learn more about WSGI servers on Full Stack Python.
·fullstackpython.com·
Excel VBA - VBA Best Practices
ALWAYS Use Option Explicit; Work with Arrays, Not With Ranges; Switch off properties during macro execution; Use VB constants when available; Avoid using SELECT or ACTIVATE; Always define and set references to all Workbooks and Sheets; Use descriptive variable naming; Document Your Work; Error Handling; Never Assume The Worksheet; Avoid using ActiveCell or ActiveSheet in Excel; WorksheetFunction object executes faster than a UDF equivalent; Avoid re-purposing the names of Properties or Methods as your variables
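
A minimal sketch pulling a few of these practices into one routine (the sheet name "Data" and the range are placeholders, not from the article):

Option Explicit

Public Sub DoubleValues()
    'Switch off properties during macro execution, then restore them
    Application.ScreenUpdating = False

    'Always define and set references; avoid ActiveSheet, Select, and Activate
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("Data")   'placeholder sheet name

    'Work with arrays, not with ranges: read once, process in memory, write once
    Dim values As Variant
    values = ws.Range("A1:A100").Value

    Dim i As Long
    For i = LBound(values, 1) To UBound(values, 1)
        values(i, 1) = values(i, 1) * 2
    Next i

    ws.Range("A1:A100").Value = values

    Application.ScreenUpdating = True
End Sub

Error handling is omitted for brevity; in production code you would also restore ScreenUpdating inside an error handler.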
·devtut.github.io·
Data Modeling - Relational Databases (SQL) vs Data Lake (File Based) - Confessions of a Data Guy
Data Modeling is a topic that never goes away. Sometimes I do reminisce about the good ol’ days of Kimball-style data models, it was so simple, straightforward, just the same thing for years. Then Big Data happened, Spark happened. Things just changed. There is a lot of new content coming out around Data Lakes and […]
·confessionsofadataguy.com·
Read the R source! - R-hub blog
Ever heard the phrase "Read the source, Luke"? It's a play on "Use the force, Luke" from Star Wars, with no definite source 😉 that we could find. (We erroneously first linked to a rather recent blog post, but Robert Link corrected us in a comment that we reproduce here in case the post gets separated from its comments: "'Use the Source, Luke' goes way back before 2012, and probably even before blogs were a thing.")
·blog.r-hub.io·
VBA Class Modules: gateway to SOLID code
Core contributor to the Rubberduck project, co-author of Microsoft Access in a Sharepoint World (2011), Professional Access 2013 Development (2013), and Effective SQL: 61 Specific Ways to Write Bet…
There are popular misconceptions surrounding VBA and object-oriented programming (OOP), usually in two forms: VBA isn't really OOP, so you can't really use OOP principles with VBA; and OOP makes things too complicated, procedural programming is all you need anyway.
Coming from a procedural mindset, OOP can feel like you're dealing with several layers of lasagna; it does require a change in how you perceive the code.
Procedural design enables you to solve business problems quickly so that you can get on with other stuff. However, what if it's so successful that they come back asking for more features? How many changes do you have to make? With procedural programming, the upkeep is cumulative: the first few feature requests are easy and put into action quickly; the next few take more time and more tweaking; after that, each one feels harder and harder. But coding should not be like that! The effort to add a new feature should not grow exponentially! That is what OOP promises: by keeping a clean codebase, it is easy to describe a new feature and integrate it with minimal change.
Most programmers nowadays should be emphasizing writing refactor-friendly code. What do we mean by refactor-friendly? Basically, it is a codebase that is easy to change because you are able to change only the pieces that actually need to change and no more than that. That is very difficult to do in a purely procedural system.
The other important aspect to learn is that we want to make the wrong code look obviously and blatantly wrong.
Taking up OOP principles can significantly help us make wrong code look wrong, which means it becomes easier to fix. You've probably had to deal with a giant, hundreds-of-lines procedure with a great wall of declarations and deeply nested code, and had the thrill of debugging it and cursing while your minor change cascades into something catastrophic. Well, there's a better way!
To reinforce the concepts, we will do a build-up starting with a familiar approach and transforming it into a clean codebase that is very refactor-friendly. The benefit is that you end up with a codebase that is easy to read, understand, and maintain. Because this assumes you are familiar with procedural programming (e.g., writing small functions or routines that perform a complex task by breaking it down into small steps), we need to provide a good transition from a procedural mindset to an object-oriented mindset.
We will start by creating custom types and doing some work with them, then use that as the basis for our first class.
Creating your custom types

You may have already used a user-defined type (UDT), which is a convenient way to group closely related properties together into a single structure. You may have used one before, especially if you've ever had to call certain API functions via Declare statements. Let's start with a Person UDT. We can create a new standard module and define the UDT within it:

Public Type Person
    FirstName As String
    LastName As String
    BirthDate As Date
End Type
One thing about a UDT is that it cannot have any methods.
When we read the code, it is easy to understand what it is doing, because we separate the mechanics of creation from the calling context, which just needs something created without knowing the particular details of the act of creation.
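
As a rough sketch of such a creation procedure (the NewPerson name and its arguments are illustrative, assuming the Person UDT defined above):

'Creation procedure: hides the mechanics of building a Person
Public Function NewPerson(ByVal FirstName As String, _
                          ByVal LastName As String, _
                          ByVal BirthDate As Date) As Person
    Dim result As Person
    result.FirstName = FirstName
    result.LastName = LastName
    result.BirthDate = BirthDate
    NewPerson = result
End Function

'Calling code only states what it wants created, not how:
Public Sub Demo()
    Dim p As Person
    p = NewPerson("Ada", "Lovelace", #12/10/1815#)
    Debug.Print p.FirstName & " " & p.LastName
End Sub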
There is no way for us to control the access. VBA does not allow us to create a UDT that cannot be edited once created, which is often referred to as being "immutable". So, when we pass around a UDT, we are always trusting that everyone will follow the conventions we build around the type.
We saw that the UDT can be instantiated multiple times, enabling us to juggle more than one instance of the same type at the same time. We wrote some procedures that interact with the UDT to compensate for its shortcomings, such as making creation easy or managing a sensitive change such as changing a person's last name, which may have additional constraints beyond just the code itself. We saw that a UDT does not really do a good job of managing access to its properties, which requires us to follow conventions that are not enforced by the compiler, and that can make the coding around a UDT highly prone to errors or omissions.
The very first thing we want to do with our first class is to define the private data it needs to have to work correctly. We could start with nothing but public fields.
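
To make that concrete, here is a hedged sketch of what such a Person class module could grow into, with private fields exposed through Property procedures; the member names mirror the UDT above, but the exact code is an assumption rather than the article's own:

'Class module: Person
Option Explicit

Private mFirstName As String
Private mLastName As String
Private mBirthDate As Date

Public Property Get FirstName() As String
    FirstName = mFirstName
End Property

Public Property Let FirstName(ByVal value As String)
    mFirstName = value
End Property

Public Property Get LastName() As String
    LastName = mLastName
End Property

Public Property Get BirthDate() As Date
    BirthDate = mBirthDate
End Property

'The class, not the caller, decides how sensitive changes happen:
Public Sub ChangeLastName(ByVal newName As String)
    If Len(newName) = 0 Then Err.Raise 5, , "Last name cannot be empty"
    mLastName = newName
End Sub

Unlike the UDT, access now goes through members the class itself defines, so the compiler, rather than convention, enforces the rules.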
Controlling access to methods via interfaces
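
A rough illustration of the idea, assuming a Person class like the sketch above (IPersonReader and its member are hypothetical names, not from the article): the interface is just another class module declaring the members a caller may see, and the concrete class Implements it.

'Class module: IPersonReader (the interface)
Option Explicit

Public Property Get FullName() As String
End Property

'Inside the Person class module:
Implements IPersonReader

Private Property Get IPersonReader_FullName() As String
    IPersonReader_FullName = mFirstName & " " & mLastName
End Property

'In a standard module: client code declared against the interface
'can only reach what the interface exposes
Public Sub Demo()
    Dim reader As IPersonReader
    Set reader = New Person
    Debug.Print reader.FullName
End Sub

Code written against IPersonReader cannot call ChangeLastName or the Property Let members, which is exactly the access control a UDT could not give us.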
·rubberduckvba.wordpress.com·
Why Every Entrepreneur Should Be Doing A Weekly Review
Ever get that nagging feeling at the back of your head that there's something you're supposed to do, but you can't remember what it is? Or you know you're supposed to ask the person you're talking to something specific? Or you've got so many open loops that you aren't sure what the logical next step is? I used to. I don't very much anymore. I've always been a bit of the absent-minded professor type, prone to getting lost in my head or on a walk. That means that I've always had a problem losing or forgetting things that need to get taken care of.
·taylorpearson.me·
awslabs/aws-dataall: A modern data marketplace that makes collaboration among diverse users (like business, analysts and engineers) easier, increasing efficiency and agility in data projects on AWS.
A modern data marketplace that makes collaboration among diverse users (like business, analysts and engineers) easier, increasing efficiency and agility in data projects on AWS.
·github.com·
Monitor data quality in your data lake using PyDeequ and AWS Glue | AWS Big Data Blog
In our previous post, we introduced PyDeequ, an open-source Python wrapper over Deequ, which enables you to write unit tests on your data to ensure data quality. The use case we ran through was on static, historical data, but most datasets are dynamic, so how can you quantify how your data is changing and detect […]
·aws.amazon.com·
haciduru/kal: This is a script that you can use to encrypt data using Rscript command line tool. It would be impossibly difficult to decrypt the data if you had not seen the code in this file.
This is a script that you can use to encrypt data using the Rscript command line tool. It would be impossibly difficult to decrypt the data if you had not seen the code in this file.
·github.com·